2022 Western Medical Research Conference | Journal of Investigative Medicine


http://dx.doi.org/10.1136/jim-2022-WRMC



2UC Davis Children’s Hospital, Sacramento, CA

3Adventist Health Lodi Memorial, Lodi, CA

Tobacco use starts young and is the leading cause of preventable disease, disability, and death in the United States. Secondhand smoke increases ear and respiratory infections, asthma attacks, and risk of Sudden Unexpected Infant Death. Few smoking cessation studies in inpatient pediatrics are formal quality improvement projects and most are at academic institutions. We sought to increase smoke exposure screening, smoking cessation education, and referrals in our community hospital pediatric population. By improving screening and documentation, we anticipate increased provider awareness and smoking cessation interventions.

All pediatric ward, newborn nursery, and Level II nursery admissions were eligible. Interventions were education on smoke exposure screening and Helpline referrals, standardizing documentation for screening and discharge instructions, visual reminders, and Helpline wallet cards.

The primary outcome measure was monthly percentage of pediatric inpatients screened for smoke exposure. Secondary outcomes were percentage of pediatric inpatients screening positive for smoke exposure who received discharge instructions or who received a Helpline referral (self or family member). Length of stay (LOS) was monitored as a balancing measure.

Outcome measures were analyzed with statistical process control in SPC for Excel. Baseline and intervention periods for LOS were compared with t-tests.
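The statistical approach lends itself to a short illustration. Below is a minimal Python sketch, on made-up monthly counts, of how screening rates could be tracked on a p-chart (the control-chart type typically used for proportions, with 3-sigma limits like the UCL/LCL referenced in the figure caption) and how baseline and intervention length of stay could be compared with a t-test. The study itself used SPC for Excel, not this code, and every number below is invented.

```python
# Minimal sketch, assuming hypothetical monthly screening counts and LOS data.
import numpy as np
from scipy import stats

screened = np.array([10, 12, 9, 30, 45, 52, 60])   # patients screened per month
admitted = np.array([70, 75, 68, 66, 71, 74, 78])   # eligible admissions per month
p = screened / admitted                              # monthly screening proportion
p_bar = screened.sum() / admitted.sum()              # p-chart centerline

# 3-sigma control limits vary with each month's denominator
sigma = np.sqrt(p_bar * (1 - p_bar) / admitted)
ucl = np.clip(p_bar + 3 * sigma, 0, 1)
lcl = np.clip(p_bar - 3 * sigma, 0, 1)
print("monthly rates:", np.round(p, 2))
print("Avg:", round(p_bar, 2), "UCL:", np.round(ucl, 2), "LCL:", np.round(lcl, 2))

# Balancing measure: compare length of stay before vs. after the interventions
los_baseline = np.array([2.1, 2.4, 1.9, 2.2, 2.0])
los_intervention = np.array([2.0, 2.3, 2.1, 2.2, 1.8])
t, p_value = stats.ttest_ind(los_baseline, los_intervention)
print(f"LOS t-test: t={t:.2f}, p={p_value:.3f}")
```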

We increased baseline average smoke exposure screening rates from 14% to 73%, meeting criteria for special cause variation (figure 1). Education on smoke exposure avoidance increased from 5% to 57%. Helpline referrals increased from 0% to 21%. There was no significant change in length of stay.

Vertical lines indicate the timing of interventions: 1) monthly pediatrician education started; 2) EMR documentation standardized, visual reminders posted; 3) Helpline wallet cards available. UCL = Upper Control Limit, Avg = Average, LCL = Lower Control Limit. Baseline: Dec 2019-Nov 2020. Intervention: Dec 2020-June 2021.

Pediatrician-led smoking cessation interventions are feasible and effective in community hospital pediatric units with no significant impact on length of stay.

Valley Children’s Healthcare, Madera, CA

Pilomatricoma is a rare skin neoplasm that is often confused with dermoid or branchial cleft cysts. Julian et al. reported that pilomatricomas are commonly misdiagnosed pre-operatively in up to 75% of cases.

We report the case of a child with Turner’s syndrome with a pilomatricoma that was diagnosed on biopsy. We review the histopathologic features and emphasize its association with Turner’s syndrome.

A 2-year-old female with Turner syndrome presented with a progressive mass above her right upper lip for 6 months.

On exam, she was well-appearing with phenotypical features of Turner’s syndrome. A 0.5 x 0.5 cm erythematous, verrucous, well circumscribed, nontender, mobile lesion was noted above her lip.

She underwent complete excision of the mass without complication. Excisional biopsy revealed the presence of viable basaloid cells and shadow cells confirming the diagnosis of pilomatricoma.

Pilomatricomas, also known as pilomatrixomas, are benign subepidermal tumors of the hair follicle matrix. The lesions occur on the face and neck, with a mean age of onset between 5.8 and 7 years.

Lesions are usually asymptomatic, but inflammation and ulceration can occur. The most common clinical presentation is a firm subcutaneous lesion with an irregular surface. The overlying skin may be red, blue, or display the tent sign. Studies have reported the initial development of pilomatricomas in children with Turner’s syndrome. The exact cause is unknown, although animal studies suggest a genetic component.

Histopathologically, a pilomatricoma appears as a mass composed of viable basaloid cells, shadow cells, calcification, and ossification. The mainstay of treatment is complete surgical excision, as the lesions do not regress spontaneously. Early excision within 12 months of diagnosis is associated with better cosmetic outcomes. Recurrence and malignant transformation are rare.

This case highlights the importance of considering pilomatricoma as a cause of solitary skin nodules, especially when on the head, neck or upper extremities. Additionally, physicians caring for children with Turner syndrome should be aware of the prevalence of pilomatricoma in this population.

1Fresno High School, Fresno, CA

2University of California San Francisco Fresno, Fresno, CA

Fresno High Women’s Alliance students continue to collaborate with UCSF Fresno pediatricians to create community action research projects on topics of adolescent concern using a ‘youth as partner’ approach. Given the social isolation and increasing depression students noticed among themselves and their peers due to COVID-19, the Women’s Alliance teens chose to focus this past year on improving mental health among their peers. Mental Health Hopscotch was chosen for its simplicity and ease of use. The fact that it was created by an adolescent in response to the COVID-19 pandemic provided further impetus for its use.

Students collaborated with the school’s Social Emotional Wellness and Support team, choosing to do their mental health intervention during National Mental Health Month. They created a Mental Health ‘Sunshine’ at the school entrance, chalking positive affirmations in a sun-shaped diagram for all to see, and chalked a Mental Health Hopscotch on the sidewalk in front of the school. Silicone bracelets with motivational quotes and mental health awareness pencils, stickers, and mini buttons were handed out to those who completed the Hopscotch. A QR code linked to a Google Form was used to survey students who completed the Hopscotch.

Forty-two students were surveyed. Twelve percent of students reported their average stress level was ‘just right’, 38% reported they could ‘handle’ their stress, 21% felt they were ‘getting stressed’, 19% reported they were ‘starting to lose it’, and 10% described their stress as ‘getting out of control’. Students primarily dealt with stress by listening to music (31%), exercising (19%), and sleeping (14%). Half of students surveyed reported difficulty sleeping at night, while 90% felt that doing the Mental Health Hopscotch helped boost their mood.

Although only a limited number of students were surveyed due to the restricted number of students present on campus, the majority felt stressed, with half reporting difficulty sleeping at night. Mental Health Hopscotch provided a simple, quick, no-cost approach to boosting students’ mood, empowering teens concerned about the mental health of their peers to stage a mental health intervention on their own school grounds.

1The University of British Columbia Faculty of Medicine, Vancouver, BC, Canada

2BC Children’s Hospital, Vancouver, BC, Canada

Daylight Saving Time (DST) is a biannual time change: during ‘spring forward,’ clocks are set forward one hour, potentially resulting in sleep deprivation for much of society, while during ‘fall back,’ the opposite occurs. Circadian rhythm disruption has been shown to affect cardiovascular, neuropsychiatric, metabolic, immune-related, and accidental events in adults. A 2018 study showed increased emergency department (ED) visits for adults in the period after DST. These findings have not been verified in pediatric populations and, if present, may have implications for managing ED patient volume and expectations. We hypothesized that the large-scale sleep deprivation following the spring time change would result in increased ED presentations, particularly among certain presentations (neurologic, psychiatric, accidental/traumatic) that may be especially susceptible to sleep deprivation, and that the fall time change would have the opposite effect.

We retrospectively collected and analyzed the primary medical complaint of all children (0–16 years) presenting to BC Children’s Hospital ED in the 2 weeks before and 3 weeks after the biannual DST time changes during 2011 to 2019. Incidence ratios (IR) of ED presentations were calculated over day 0 (day of time change) to day 7. IRs were calculated for all presentations and broken down by specialty.
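As a rough illustration of an incidence-ratio calculation, the sketch below compares hypothetical daily ED visit counts in the week after a time change against the mean count for the same weekday in the two preceding weeks. The abstract does not state its exact denominator, so this pairing is an assumption, and all counts are invented.

```python
# Minimal sketch, assuming invented daily visit counts (Mon..Sun).
import numpy as np

baseline_weeks = np.array([
    [140, 135, 138, 142, 150, 160, 155],   # two weeks before the time change
    [138, 137, 140, 141, 149, 158, 152],
])
post_change_week = np.array([148, 146, 147, 140, 151, 159, 154])  # first week after

baseline_mean = baseline_weeks.mean(axis=0)
incidence_ratio = post_change_week / baseline_mean
for day, ir in zip(["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"], incidence_ratio):
    print(f"{day}: IR = {ir:.2f} ({(ir - 1) * 100:+.0f}%)")
```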

After excluding infectious presentations, the IR was increased during the first week following the spring time change: Monday by 6%, Tuesday by 7%, and Wednesday by 6%, though these results were not statistically significant (p>0.05). There were significant decreases (p<0.05) after the fall time change: Monday by 12%, Tuesday by 13%, and Wednesday by 8%.

Following the seasonal time change in the spring, there were increases in IR, though this did not reach statistical significance. There were significant decreases in IR during the three days following the fall shift. Together these findings suggest that the widespread sleep deprivation at spring time change results in adverse health effects among children, while extra sleep in the fall time change may be protective.

Circadian rhythm disruption from DST shows potentially important effects on pediatric emergency visits, and further study could lead to better patient care and ED preparation. This may be informative in developing policy regarding the need for DST.

Charles Drew University of Medicine and Science, Los Angeles, CA

Adults with a history of adverse childhood experiences (ACEs) are at increased risk for chronic disease and, thus, poor health outcomes. Yet the effect of chronic disease awareness on health outcomes in adults with ACEs has not been examined. The objective of this study was to determine the relationship between general health status and chronic disease, access to care, and awareness of chronic conditions among adults with a history of ACEs.

Data from the 2019 Behavioral Risk Factor Surveillance System were analyzed. Descriptive statistics were used to determine the prevalence of ACEs, chronic disease, healthcare access, chronic disease awareness, general health status, and population characteristics. Bivariate analyses using chi-squared tests were performed for history of ACEs and for general health status against all independent variables. Multivariable logistic regression was used to determine the relationship between general health status and ACEs, adjusting for chronic disease, healthcare access, chronic disease awareness, and demographics.
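For readers who want to see the shape of the adjusted analysis, here is a minimal Python sketch on synthetic stand-in data: a logistic model of fair/poor health on ACE category plus covariates, reported as odds ratios. The variable names are invented, only a subset of the abstract's covariates is included, and the BRFSS complex survey design (strata, PSUs, and sampling weights) is deliberately omitted.

```python
# Hedged sketch of the adjusted analysis on synthetic stand-in data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "ace_count": rng.choice(["0", "1", "2", "3", "4+"], n),
    "chronic_disease": rng.integers(0, 2, n),
    "has_coverage": rng.integers(0, 2, n),
    "disease_awareness": rng.integers(0, 2, n),
})
# Simulate the outcome so ACE exposure raises the odds of fair/poor health
logit = -1.0 + 0.3 * (df["ace_count"] == "4+") + 0.5 * df["chronic_disease"]
df["fair_poor_health"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.glm(
    "fair_poor_health ~ C(ace_count, Treatment('0')) + chronic_disease"
    " + has_coverage + disease_awareness",
    data=df, family=sm.families.Binomial(),
).fit()
print(np.exp(model.params))  # odds ratios relative to the no-ACE group
```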

Of the 78,112 respondents, 63% reported being exposed to at least one ACE. History of ACEs was associated with a higher prevalence of chronic disease (p<0.002), lower healthcare coverage (p<0.0001), lower chronic disease awareness (p<0.006), and reports of fair/poor general health status (p<0.0001). In adjusted analyses, adults with 2, 3, and ≥4 ACE events had 1.24, 1.22, and 1.45 times increased odds of reporting fair/poor health status compared to those reporting no ACEs.

Adults with a history of ACEs face barriers to achieving good health. There is a need to expand ACE screening in primary care settings so that early intervention can improve general health outcomes.

Loma Linda University School of Medicine, Loma Linda, CA

The prevalence of overweight adolescents has increased dramatically over the last decade. Although previous research has demonstrated the contributions of diet, exercise habits, and parenting to obesity rates, this early study addresses the relationship between hospitalization, the Social Vulnerability Index, social determinants of health, and childhood obesity.

We studied children between the ages of 5 and 17.9 years seen in inpatient (n=39) and outpatient (n=35) settings at healthcare facilities affiliated with Loma Linda University Health from January 2020 through June 2021. We collected Body Mass Index (BMI) scores and demographic information. A standardized questionnaire gathered social determinants of health information available in the electronic health record. Using home addresses, we identified the Social Vulnerability Index (SVI) score (Flanagan, 2011) associated with each patient's census tract (Centers for Disease Control and Prevention, 2018). The SVI ranks each tract on four main themes: socioeconomic status, household composition and disability, minority status and language, and housing and transportation.

The most significant comparison was the difference in risk between the inpatient (n=39) and outpatient (n=35) populations. Inpatients were more likely to experience Social Connection risk (p=0.046), Financial risk (p=0.0006), and Food Insecurity (p=0.0077) than outpatients.

When comparing between patients living in the upper quartile (n=40) and the lower three quartiles (n=34) of SVI, there was little difference in BMI, Financial risk, Food Insecurity risk, Transportation risk, Physical Activity risk and Stress risk. There was similarly little difference when comparing age groups of younger versus older children.

BMI ≤ 85th percentile = normal weight or underweight, BMI > 85th percentile = overweight or obese (Kuczmarski, 2002). *Hisp = person of Cuban, Mexican, Puerto Rican, South/Central American, or other culture regardless of race (US Census Bureau, 2011).

There are significant disparities in risk between pediatric inpatient and outpatient populations. Steps could be taken to identify inpatients with reduced resources to improve food insecurity, social connection risk, and financial risk. Although SVI is important for understanding the context of each patient, every family has unique social determinants and risks that could be addressed by physicians.

1Midwestern University Arizona College of Osteopathic Medicine, Glendale, AZ

3Arizona State University, Phoenix, AZ

4Phoenix Children’s Hospital, Phoenix, AZ

Social status and food insecurity (FiS) may contribute to health disparities among youth. This study aimed to evaluate whether subjective social status (SSS) and FiS are associated with weight-specific quality of life (wQoL) among Latino youth with obesity. We further explored whether relationships differed by sex.

One hundred forty-one Latino youth (47% male; mean age 15.3±0.9 years) with obesity completed surveys to assess SSS, perceived FiS, and wQoL (self, social, and environment). Separate linear regression models were performed to examine the relationships of SSS and FiS with wQoL after controlling for sex and BMI percentile. Data were then stratified by sex to determine Pearson's r for wQoL and SSS and for wQoL and FiS.
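The modeling strategy can be sketched in a few lines. The following Python example, on synthetic data with invented variable names, fits a linear model of total wQoL on school SSS and food-insecurity category while controlling for sex and BMI percentile, then computes sex-stratified Pearson correlations, mirroring the steps described above.

```python
# Minimal sketch, assuming synthetic data and invented variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 141
df = pd.DataFrame({
    "sex": rng.choice(["M", "F"], n),
    "bmi_pct": rng.uniform(95, 99.9, n),
    "sss_school": rng.integers(1, 11, n),  # 1-10 ladder rating
    "food_insecurity": rng.choice(["none", "marginal", "high"], n),
})
df["wqol_total"] = 40 + 4 * df["sss_school"] + rng.normal(0, 20, n)

model = smf.ols(
    "wqol_total ~ sss_school + C(food_insecurity, Treatment('none'))"
    " + C(sex) + bmi_pct",
    data=df,
).fit()
print(model.params)  # the sss_school coefficient plays the role of beta

for sex, grp in df.groupby("sex"):
    r, p = pearsonr(grp["sss_school"], grp["wqol_total"])
    print(f"{sex}: Pearson r = {r:.2f}, p = {p:.3f}")
```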

Mean wQoL scores were 64.1±24.9 (total), 57.3±29.3 (self), 69.5±25.6 (social), and 60.5±26.3 (environment), with males reporting higher total, self, and environmental wQoL compared to females (all p<0.05) after controlling for BMI. Over one-third of the cohort indicated very low food insecurity (36.2%) or marginal food insecurity (34%), which did not differ by sex. Despite no sex differences in SSS society (mean diff=-0.074, p=0.77) or SSS school (mean diff=-0.354, p=0.28), there was a positive relationship between SSS school and all scales of wQoL regardless of sex (all p<0.01). SSS school was a significant predictor of total wQoL (β=4.24, p<0.001) and of the self (β=3.49, p=0.008), social (β=4.40, p<0.001), and environment (β=4.57, p<0.001) subscales. SSS school explained 9% of the variance in total, social, and environment wQoL, and 4% of the variance in self wQoL. There was an inverse relationship between FiS and all scales of wQoL, particularly for those experiencing marginal and high FiS (all p<0.03). This correlation was stronger for males across all wQoL scales at all levels of FiS except high FiS. Marginal FiS was a significant predictor of total (β=-12.94, p=0.006), self (β=-12.48, p=0.029), social (β=-12.06, p=0.013), and environment (β=-14.66, p=0.003) wQoL after controlling for sex and BMI percentile. High FiS was a significant predictor of total (β=-22.46, p<0.001), self (β=-18.56, p=0.016), social (β=-24.16, p<0.001), and environment (β=-22.76, p<0.001) wQoL after controlling for sex and BMI percentile. Marginal and high FiS explained 8% of the variance in total, social, and environmental wQoL, and 3% of the variance in self wQoL.

Among Latino youth with obesity, social status is associated with wQoL while food insecurity is inversely associated with wQoL.

University of Nevada Reno School of Medicine, Reno, NV

The prevalence of pediatric obesity continues to increase. In Nevada, approximately 40% of youth are overweight or obese, and 70% of these children will remain overweight as adults. Traditional medical school curricula do not adequately prepare students to counsel families on this subject. The purpose of this study was to evaluate the progression of third-year medical students' knowledge of managing obesity in children and adolescents over the course of the pediatric clinical curriculum.

Sixty-three medical students in the third-year clinical clerkship curriculum at the University of Nevada, Reno School of Medicine were given a survey to evaluate their knowledge of pediatric obesity and perceptions surrounding treatment before a six-week pediatric clerkship. During the clerkship, students received instruction on the diagnosis and treatment of obesity. At the conclusion of the clerkship, students took a post-survey to assess knowledge gained during the experience. Paired-samples t-tests and chi-square tests were used to assess differences between pre- and post-surveys.
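As an illustration of the pre/post comparison, the sketch below runs a paired-samples t-test on synthetic Likert-style comfort scores for 63 students. The means used to generate the data echo the abstract's reported values, but every individual score is invented.

```python
# Minimal sketch, assuming synthetic pre/post comfort ratings on a 1-5 scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_students = 63
pre = np.clip(np.round(rng.normal(1.97, 0.91, n_students)), 1, 5)
post = np.clip(np.round(pre + rng.normal(1.58, 0.80, n_students)), 1, 5)

t, p = stats.ttest_rel(pre, post)  # paired: the same students measured twice
print(f"t({n_students - 1}) = {t:.2f}, p = {p:.4g}")
```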

After the clerkship, there was an increase in the mean score in knowledge and comfort level in recommending a treatment program for overweight/obese children between pre- (M=1.97, SD=0.91) and post-tests (M=3.55, SD=0.89) (t(62)=10.25, p<0.0001). There was also an increase in the mean score in their ability to effectively counsel overweight/obese children between pre- (M=2.40, SD=0.90) and post-tests (M=3.86, SD=0.77) (t(62)=10.33, p<0.0001). In addition, there was an increase in students believing in the overall efficacy of counseling in the treatment for overweight/obese children and adults between the pre- (M=2.91, SD=0.75) and post-tests (M=3.34, SD=0.76) (t(62)=3.23, p=0.0019).

The curriculum improved student knowledge and understanding of pediatric obesity. In particular, perceived comfort and ability to counsel patients and families about obesity prevention and treatment increased. As obesity continues to be a challenge, expansion of medical student education in this area is imperative to address this problem.

1The George Washington University School of Medicine and Health Sciences, Washington, DC

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

The ongoing COVID-19 pandemic has brought considerable challenges to heart transplantation. Most notably, toward the start of the pandemic, changes in the outpatient care of post-transplant patients were introduced at our center to further prevent the spread of the virus and protect highly immunosuppressed recipients. For example, blood draws for immunosuppression adjustments were conducted at home and early coronary angiograms were eliminated. Video visits were conducted for post-operative months 7, 9, and 11. Despite these changes in management, formal analyses of their impact on heart transplant recipient outcomes have yet to be conducted. Now over one year removed from the pandemic's onset, we sought to examine whether the modifications in outpatient care affected 1-year outcomes of patients transplanted during the start of the COVID-19 pandemic.

Between March 6 and September 1, 2020, we assessed 50 heart transplant patients transplanted during the COVID-19 pandemic. These patients were compared to patients who were transplanted during the same months between 2011 and 2019 (n=482). Endpoints included subsequent 1-year survival, 1-year freedom from cardiac allograft vasculopathy (CAV: stenosis ≥ 30%), 1-year freedom from any-treated rejection, 1-year freedom from acute cellular rejection, 1-year freedom from antibody-mediated rejection, hospital and ICU length of stay, and 1-year freedom from non-fatal major adverse cardiac events (NF-MACE: MI, new CHF, PCI, ICD/pacemaker, or stroke).

Patients transplanted during the COVID-19 pandemic had similar outcomes compared to those of patients transplanted in years prior to the pandemic. There were no significant differences in hospital and ICU length of stay between the two groups. There were also no significant differences in 1-year survival, 1-year freedom from CAV, 1-year freedom from any treated rejection, and 1-year freedom from acute cellular or antibody mediated rejection between both groups. Patients transplanted during the pandemic had a significantly higher 1-year freedom from NF-MACE.

Despite necessary changes being made to post-transplant care to mitigate the spread of COVID-19 and protect an immunosuppressed population, heart transplantation during the COVID-19 pandemic appears safe with 1-year outcomes comparable to years prior.

Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

The COVID-19 pandemic affected how our medical staff cared for heart transplant (HTx) patients. Patients were seen virtually via telemedicine and self-isolated at home. The impact of this care model on HTx outcomes is not known. As patients were self-isolating, it is possible that medication adherence and medical compliance increased, and there may have been a decrease in non-COVID infection rates as exposure was minimized. None of these factors has been assessed previously, and thus we reviewed our large, single-center patient population for this study.

Between March 2020 and September 2020, we assessed 55 HTx patients who were transplanted during this period and followed for 6 months. Patients were self-isolating and had every other clinic visit changed to a virtual visit to minimize exposure to COVID-19. Endpoints for this study included 6-month survival, re-hospitalization, number of non-COVID infections (defined as the need for intravenous antibiotics), any treated rejection (ATR), and maintenance of therapeutic immunosuppressive blood levels. The study patients were compared to a control group drawn from the previous three years, averaged across the same time points (starting March 13, 2017, March 13, 2018, and March 13, 2019) and followed for 6 months.

The study group (during the COVID pandemic) demonstrated a significant decrease in re-hospitalization in the first 6 months following HTx compared to the control group. There was a numerical decrease in non-COVID infectious complications. There was no difference in survival and freedom from any-treated rejection episodes between the two groups. Reasons for rehospitalization included infections, various cardiac and renal issues, malaise, and fever.

The COVID-19 pandemic period demonstrated that self-isolation and virtual visits were associated with fewer hospitalizations, possibly due to fewer infectious complications. This implies that perhaps stricter restrictions on community exposure might benefit HTx patients in the 6 months following transplantation.

1University of California Los Angeles, Los Angeles, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

There are many reports in organ transplantation demonstrating sex discrepancies in urgent waitlist status, time on the heart transplant waitlist, and waitlist mortality. There are no differences between men and women in terms of heart disease or mortality. It would therefore be expected that similar percentages of men and women would be listed as urgent status, especially after the new donor heart allocation policy change took place in October 2018. We assessed our male and female patients to establish whether there is a difference in patients listed as urgent status on the HTx waitlist.

Between November 2018 and December 2020 (after donor heart allocation change in October 2018), we assessed 276 patients on the HTx waitlist. Patients were followed for 6 months and censored after they were transplanted or removed from the waitlist. Percent of patients of each sex listed as urgent status (status 1, 2, 3) was recorded. Mortality on the waitlist, waitlist time, and removal from the waitlist due to being too sick were secondary endpoints.

After the donor heart allocation policy change in October 2018, women were significantly less likely to be listed as urgent status compared to men. For those patients listed as urgent status, there was no significant difference in mortality for women versus men on the HTx waitlist. The waitlist time was shorter for men compared to women (see table 1).

There appears to be a sex disparity for women being less likely to be listed as urgent status on the HTx waitlist. Further studies are needed to determine whether this difference has a biologic mechanism or whether there is selection bias and/or treatment bias present in their care.

1University of California Los Angeles, Los Angeles, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

In heart transplantation (HTx), donor-to-recipient size matching has traditionally been done by height and weight. More recently, predicted heart mass (PHM) has been found to be more clinically useful in reflecting outcomes. Using PHM, it has been demonstrated that under-sizing a donor heart for a larger recipient with high pulmonary artery pressures leads to increased mortality. It has recently been noted in the International Society for Heart and Lung Transplantation (ISHLT) registry that there may be increased risk in placing a donor heart that is oversized by weight into a smaller recipient. This clinical outcome has not been established using PHM. We sought to address this question in our large, single-center experience using PHM.

Between January 2010 and June 2020, we assessed 588 donor-to-recipient donor heart matches. We used PHM to assess whether there were outcome differences when the donor hearts were oversized. We divided the donor-to-recipient PHM ratio into two categories: normal (90–110%, n=524), and markedly oversized (greater than 140%, n=64). Outcomes included 1-year survival, freedom from 1-year rejection (acute cellular rejection [ACR], antibody-mediated rejection [AMR]), freedom from cardiac allograft vasculopathy (CAV: stenosis ≥30%), freedom from cardiac dysfunction (defined as LVEF ≤40%), and freedom from non-fatal major adverse cardiac events (NF-MACE: myocardial infarction, new congestive heart failure, percutaneous coronary intervention, implantable cardioverter defibrillator/pacemaker implant, stroke).
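The size-matching step reduces to simple arithmetic once PHM is known for donor and recipient. The sketch below, with made-up PHM values, classifies donor-recipient pairs using the abstract's cut points; the published PHM regression equations themselves (based on age, sex, height, and weight) are not reproduced here.

```python
# Minimal sketch, assuming PHM values have already been computed elsewhere.

def classify_phm_ratio(donor_phm_g: float, recipient_phm_g: float) -> str:
    """Return the size-match category for a donor/recipient PHM ratio."""
    ratio = 100.0 * donor_phm_g / recipient_phm_g
    if 90.0 <= ratio <= 110.0:
        return "normal match (90-110%)"
    if ratio > 140.0:
        return "markedly oversized (>140%)"
    return "outside the two study groups"

# (donor PHM, recipient PHM) in grams -- illustrative values only
for donor, recipient in [(180.0, 175.0), (240.0, 160.0), (150.0, 170.0)]:
    print(donor, recipient, classify_phm_ratio(donor, recipient))
```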

Markedly oversized donor hearts by PHM, compared with normal matching, showed no difference in 1-year survival, freedom from 1-year ACR, freedom from CAV, freedom from NF-MACE, or freedom from cardiac dysfunction. There was a significantly lower 1-year freedom from AMR in the markedly oversized donor heart group, which is attributable to more women recipients (sensitized by previous pregnancies) in this group (71% in the oversized group vs. 20% in the normal group).

Markedly oversized donor-to-recipient matching using PHM does not result in poor outcomes after heart transplantation. This has potential to expand the donor pool, particularly for smaller patients.

1The University of Arizona College of Medicine Tucson, Tucson, AZ

2The University of Arizona Sarver Heart Center, Tucson, AZ

3Banner University Medical Center Tucson, Tucson, AZ

4Northwestern University Feinberg School of Medicine, Chicago, IL

The AHA 17-segment model is the preferred clinical method to define LV myocardial infarction (MI) size on CMR imaging, although it can be subjective. We propose a novel measurement technique based on long-axis (LAX) CMR from a porcine model of MI to improve the accuracy and reproducibility of infarct volume quantification. Data were collected from MRI exams and endpoint organ harvesting for histopathologic analysis.

Yucatan mini swine were subjected to 90-minute ischemia/reperfusion of the left anterior descending (LAD) coronary artery. Six months after infarction, two observers evaluated four infarct-sizing methods: myocardial contouring of post-mortem heart slices, contouring using cardiac MRI, AHA 17-segment model analysis, and novel LAX MRI infarct sizing.

Novel LAX MRI infarct sizing was performed on routine 2-, 3-, and 4-chamber LAX LGE MRIs. The precise length of MI could be measured in each plane. These measurements provide the exact anatomic location of MI along each wall of the heart relative to the apex.

Total heart length (THL) from apex to base is measured in the 2-Chamber LAX MRI to standardize processing and provide a more accurate LV infarct percentage, since not every heart is the exact same size and infarct percentage is calculated relative to specific anatomy.

Infarct length measurements from each of the 3 LAX planes are summated and divided by THL to produce a Long-Axis Ratio (LAR) value. LAR values were plotted against MRI contoured infarct size to produce a table that approximates LV infarct percentage using the LAR value.
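The Long-Axis Ratio computation described above is straightforward to express in code. The sketch below, with invented measurements, sums infarct lengths from the three LAX planes, divides by total heart length, and then maps the LAR to an approximate infarct percentage via a linear fit standing in for the authors' lookup table.

```python
# Minimal sketch, assuming made-up lengths and calibration pairs.
import numpy as np

def long_axis_ratio(infarct_lengths_mm, total_heart_length_mm):
    return sum(infarct_lengths_mm) / total_heart_length_mm

lar = long_axis_ratio([18.0, 22.0, 15.0], 85.0)  # three LAX planes, THL of 85 mm

# Hypothetical (LAR, contoured infarct %) calibration pairs
calibration = np.array([[0.10, 3.0], [0.35, 10.0], [0.65, 20.0], [0.85, 26.0]])
slope, intercept = np.polyfit(calibration[:, 0], calibration[:, 1], 1)
print(f"LAR = {lar:.2f} -> estimated LV infarct of about {slope * lar + intercept:.1f}%")
```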

LV infarct sizes ranged from 1.6% to 25.8% of the left ventricle (n=10) using reference-standard histopathologic infarct sizing. Intraclass correlation coefficients (ICC) were calculated between the two observers and averaged owing to high similarity (ICC > 0.900). A t-test (p=0.0006) and Bland-Altman plots showed statistically significant differences between 17-segment model infarct size and histopathologic analysis, while no significant difference was found for our novel method (p=0.8198). Linear correlation showed an R2 of 0.9111 between MRI-contoured infarct size and our novel MRI infarct-sizing model for predicting infarct size as a percentage, while the R2 of the 17-segment model was 0.8197. A representative patient MRI was produced to demonstrate the clinical relevance of this approach.

The AHA 17-segment model provides an inferior assessment of LV infarct size compared to the proposed LAX infarct sizing, suggesting the latter may be a robust and easily implementable quantitative assessment of LV infarct size in advanced imaging.

With over 7,000 rare diseases affecting 1 in 10 Americans, longitudinal data describing the clinical course of rare diseases are essential for understanding disease natural history and preparing for studies of novel treatments. Traditionally, published data associated with these diseases are limited to cross-sectional time points at diagnosis and major events, making inferences about the progression and trajectory of various phenotypes difficult. A prime example is Danon disease, a rare genetic cardiac disease with a malignant outcome of death or need for heart transplantation in most males.

To mitigate the lack of longitudinal natural history knowledge in Danon patients, a retrospective clinical database was developed using the REDCap database program. All current and past medical history for Danon patients enrolled internationally was collected and entered into the database by two research sites. To date, this registry has enrolled over 100 patients, with roughly equal representation of males and females, and includes data on over 550 echocardiograms.

To demonstrate the power of this natural history study, echocardiogram data collected over time were examined for trends in ejection fraction (EF) as a function of patient age over the course of illness. Data on the eight patients with the largest number of longitudinal EF data points prior to transplant were extracted and graphed. The data revealed that EF drops off much earlier, at almost half the age, in males compared to females, and that substantial variation in EF is present within subjects rather than a smooth, gradual decline.

As this natural history data continues to be analyzed, further analyses will examine the progression of heart disease in Danon patients prior to transplant. These trends will be imperative not only for understanding the progression of the disease to drive best clinical practices, but also for use as controls in clinical trials of potential treatments for Danon disease.

1Loma Linda University School of Medicine, Loma Linda, CA

2VA Loma Linda Healthcare System, Loma Linda, CA

Lifestyle counseling has been shown to be effective in modifying health behaviors and reducing cardiovascular risk in healthy patients. However, data supporting the effectiveness of lifestyle counseling in patients with symptomatic heart failure are limited. This quality improvement study hypothesizes that lifestyle counseling focusing on sleep, activity, nutrition, medication adherence, and self-care will be associated with improvements in health behaviors in veterans with heart failure.

This study screened a selected cohort of patients with symptomatic heart failure from the Loma Linda VA Heart Failure program. The study included 5 counseling sessions over a 9-week interval. In week 1, pre-intervention surveys were completed using the following validated instruments: Pittsburgh Sleep Quality Index, Veterans Specific Activity Questionnaire, Mini Nutritional Assessment, Eight-Item Morisky Medication Adherence Scale, and Self-Care of Heart Failure Index. During weeks 2 through 8, three counseling sessions focusing on nutrition, sleep, exercise, and self-care were tailored to each patient following best practices from the LLVA Heart Failure Program. In week 9, post-intervention surveys were completed.

Of the 112 patients screened, 49 (44%) agreed to participate. Each patient received baseline scores on the pre-intervention surveys during week 1. The baseline Pittsburgh Sleep Quality Index average score was 8.74, with a post-intervention survey average of 7.68; lower scores on the Pittsburgh Sleep Quality Index indicate better sleep habits and sleep quality. For the Veterans Specific Activity Questionnaire, the baseline average score was 4.36 and the post-intervention average was 4.55; this questionnaire ranks activities according to metabolic equivalent of task from 1 to 13. On the Mini Nutritional Assessment, patients reported an average score of 11.38 at baseline and 12.40 after lifestyle counseling. The Mini Nutritional Assessment has a maximum score of 14, with normal nutritional status ranging from 12 to 14, risk of malnutrition from 8 to 11, and malnutrition at scores of 7 or below. The Eight-Item Morisky Medication Adherence Scale had baseline and post-intervention average scores of 5.51 and 6.30, respectively.

All survey data indicated positive changes in lifestyle from pre to post surveys. Lifestyle counseling may improve health behaviors in patients with symptomatic heart failure.

The University of Arizona College of Medicine Tucson, Tucson, AZ

To review the literature on stroke risk in patients with atrial fibrillation (AF) undergoing electroconvulsive therapy (ECT) as well as anticoagulation recommendations.

Two authors independently performed a literature review of PubMed, searching for ‘electroconvulsive therapy and atrial fibrillation.’ The resulting articles and their references were reviewed for relevance to AF and stroke risk.

Rozig et al. (2018) found that ECT is not associated with an increased risk of new or recurrent stroke. Among 23 studies, we found that post-ECT cardioversion of AF to normal sinus rhythm occurred in 2 cases. Neither was associated with stroke, though this finding may be limited by the small sample size. However, ECT has induced AF in at least 6 cases without stroke.

ECT requires fewer joules than does cardioversion for atrial fibrillation, and current is applied to the brain for ECT versus the heart for synchronized cardioversion. The mechanism of AF induction/cardioversion is a catecholamine surge and varied hemodynamic changes.

Because of the rarity of stroke in ECT patients, routine anticoagulation prior to ECT is controversial. Furthermore, direct electrical stimulation of the brain risks hemorrhagic stroke after ECT.

In patients with existing AF who are not anticoagulated, an alternative means of reducing cardioversion risk may involve modulating post-ECT hemodynamic changes with beta-blockade. Beta-blockade may exacerbate bradycardia in some patients while reducing reflex tachycardia in others, and may even decrease seizure length, thus lowering the efficacy of ECT.

The risk of inducing stroke by cardioversion of AF in the setting of ECT is very low despite a documented risk of cardioversion. Routine anticoagulation prior to ECT remains controversial. Imaging modalities such as echocardiography or mitigation of hemodynamic effects may further reduce the risk of stroke in these patients. Areas for further study include precise assessment of cardioversion and stroke risk in AF patients and the effect of routine beta-blockade on cardioversion risk.

Western University of Health Sciences, Pomona, CA

Coronary artery disease is the leading cause of death worldwide, with over 17.9 million deaths per year according to the WHO. A vast amount of research has been done on understanding the causes, and therefore the treatments, of this disease. The body's inflammatory processes have been identified as a nidus of the elaborate process that ultimately leads to life-threatening cardiovascular events. However, research into how the body ends such naturally occurring inflammation, i.e., resolution of inflammation, is gaining traction and has shed light on new avenues for future management of cardiovascular disease. In this narrative review we discuss the pathophysiological and molecular mechanisms of atherosclerosis, including inflammation, apoptosis, and efferocytosis; recent developments in the understanding of a new class of molecules called specialized pro-resolving mediators (SPMs); and the impact of these findings on cardiovascular treatment options.

We searched the MEDLINE database via PubMed, restricting ourselves to original research articles as much as possible, and analyzed papers published in the last 20 years on the complex pathophysiology of atherosclerosis and the role of resolvins.

Specialized pro-resolving mediators (SPMs) are a class of molecules that act as strong local modulators of acute inflammation. They are further classified into resolvins (E and D series), maresins, protectins, and lipoxins. Resolvins mediate resolution of inflammation through a variety of actions, including reduced chemotaxis of neutrophils by blocking the action of LTB4, a strong neutrophil chemoattractant; reduced PMN chemotaxis by altering actin polymerization; and downregulation of leukocyte integrin activation, thereby reducing the response to platelet-activating factor (PAF), a potent pro-inflammatory mediator. Maresins are involved in converting the pro-inflammatory M1 macrophage phenotype to the pro-resolving M2 phenotype and in reducing TNFα-driven superoxide production and nuclear translocation of p65, which together result in a reduction of the pro-inflammatory NFκB pathway. Protectins act in a pro-resolving manner by downregulating markers of chemotaxis such as vascular cell adhesion molecule 1 (VCAM-1) and monocyte chemoattractant protein 1 (MCP-1). Lipoxins facilitate resolution by stopping further recruitment of neutrophils, inducing nonphlogistic migration, and inducing macrophages to clear apoptotic neutrophils.

We expect to see further research in translating these findings to bedside clinical trials in the treatment of conditions with a pathophysiological basis of inflammation such as coronary artery disease, asthma, periodontal disease, etc.

The University of New Mexico, Albuquerque, NM

To assess risk factors and the prevalence of coronary artery disease (CAD) using coronary artery calcium (CAC) scoring in transgender individuals receiving gender affirming hormone therapy (GAHT).

Transgender individuals are treated with cross-sex hormone therapy. These hormones alter metabolic profiles and may be associated with risk factors for CAD. Little data are available on the atherosclerotic vascular risk due to GAHT in transgender individuals. CAC scores have been validated as a noninvasive method to assess risk for cardiovascular events in the general population.

This is a pilot study assessing the feasibility of obtaining baseline risk profiles and CAC scores in transgender patients over the age of 18, in order to establish a baseline risk assessment and the prevalence of CAD in this population. Patients with risk factors other than smoking and family history were excluded. Differences in CAC scores were compared to those in the Coronary Artery Risk Development in Young Adults (CARDIA) study. Baseline characteristics were compared using t-tests.

Out of 25 transwomen recruited, 24 completed CAC. 3/24 (12.5%) had CAC >0. One had CAC >100. Out of 22 transmen recruited, 16 completed CAC. 2/16 (12.5%) had CAC >0. None had CAC >100. CAC scores did not correlate with the presence of risk factors. Patient data are shown in table 1.

Overall, 12.5% of transgender people on GAHT had positive CAC scores. This is similar to the findings of the CARDIA study, in which the prevalence of positive CAC scores in the general population was 11.7%. There was a higher percentage of current smokers among transwomen, who also had higher triglycerides but lower LDL. Transmen had a higher rate of family history of CAD, a higher percentage having ever smoked, and higher hsCRP. This cross-sectional study shows that obtaining CAC scores in transgender patients is feasible and that the prevalence of positive CAC scores appears to be similar to that of the general population. A larger, longitudinal study will be performed to expand on these findings.

Washington State University Elson S Floyd College of Medicine, Spokane, WA

Human sex hormone binding globulin (SHBG) is a homodimeric glycoprotein encoded by the SHBG gene on chromosome 17p13.1. The gene is predominantly expressed in the liver, and SHBG is secreted into the blood, whereas in murine species, from which much of our understanding comes, the SHBG ortholog is expressed primarily in Sertoli cells. SHBG binds both steroidal and nonsteroidal ligands with high affinity and has traditionally been viewed as a reservoir and means of transport for steroids in serum. SHBG has also been implicated in androgen uptake by cells and as a signaling molecule acting through a cell-surface receptor on target cells. More recently, a germ cell-specific form has been identified in the human testis, raising questions about a role for SHBG in germ cell function and fertilization. The objective of this review was to determine aspects of SHBG function that require further investigation.

A systematic search of the literature, using NCBI and other databases compiling peer-reviewed publications, was conducted using the keywords: Sex Hormone Binding Globulin; SHBG; SHBG Receptor; SHBG Isoform; SHBG Expression; Megalin; Androgen Binding Protein; ABP. Additional papers were found via citations, both forward and back. The focus of the review was on literature published in the last 25 years, prioritizing the most recent research. Studies were excluded where SHBG function, structure, or genetic regulation was not a topic of investigation, such as studies focused on genome-wide analysis, SNPs, or SHBG in the context of an upstream pathology.

In total, 95 publications were identified and kept for producing either new experimental evidence or analysis of prior data that provided new insights into the role of SHBG in reproduction. The literature search converged on studies investigating SHBG's steroid-binding properties, protein-protein and protein-receptor interactions, novel germ cell forms of SHBG, and clinical findings from both normal and natural mutant phenotypes. The findings from these studies were summarized, and gaps in our knowledge of SHBG's role in reproductive function were identified.

Four unanswered questions regarding SHBG/ABP function emerged from review of the literature. 1) Is SHBG essential in the development and regulation of the reproductive system? 2) What is the identity of the cell surface receptor for SHBG? A number of studies have demonstrated specific binding and subsequent signaling, yet no receptor has been identified. 3) Does the site of SHBG synthesis and secretion matter? 4) What is the function of the germ cell form of SHBG? Germ cells have the second-highest level of expression of SHBG, but we know very little about it. These questions seem key to future research on the role of SHBG in reproduction.

1The University of New Mexico, Albuquerque, NM

2University of California Los Angeles Health System, Montecito, CA

Meningiomas account for 13% of CNS tumors during pregnancy and are predominantly low-grade WHO grade I tumors. Case series have reported rapid growth of these tumors during pregnancy, particularly during the second and third trimesters, likely due to factors including intra-tumoral hypervascularity and tumor growth driven by high sex hormone levels and the presence of progesterone receptors on tumor cells. As a result, patients can present with neurologic and visual field (VF) deficits and, in rare cases, herniation and coma. We report a case of a pregnant patient with worsening VF deficits and panhypopituitarism due to a meningioma in the suprasellar and sellar region.

A 36-year-old woman, G4P3 at 20 weeks, was transferred to our facility for neurosurgical intervention for acute worsening of VF deficits. She initially presented to her ophthalmologist with a 4-month history of headaches and right-sided vision loss and was found to have a sellar mass on imaging. She then began to have nausea and vomiting, weakness, and orthostasis, which necessitated admission. The patient was given i.v. hydrocortisone (HC) due to suspicion for adrenal insufficiency, with improvement in symptoms. However, after receiving steroids, she developed polydipsia and polyuria. On arrival at our facility, the patient was noted to have urinary output (UO) of up to 6 liters per day with a low urine osmolality of 53 mosm/kg (50–600 mosm/kg) and serum Na as high as 147 mmol/L (134–144 mmol/L). She was given desmopressin (DDAVP) with improvement in serum Na and UO. Biochemical testing also revealed secondary hypothyroidism, and L-thyroxine (L-T4) was initiated. An MRI of the brain on admission showed a 3.6 cm lobulated suprasellar and sellar mass with mass effect on the optic chiasm. Formal VF testing showed complete VF loss on the right and temporal loss on the left. After review of the patient's case by a multidisciplinary team consisting of Neurosurgery, MFM, and Endocrinology, the decision was made to proceed with surgery. The patient underwent left pterional craniotomy and resection of the suprasellar component of the mass with decompression of the optic chiasm. Pathology showed a meningioma, WHO grade I. Post-operatively, VF deficits improved, and on discharge from the hospital she was continued on DDAVP, L-T4, and HC.

This is a rare case of a meningioma resulting in hormonal deficiencies of the anterior and posterior pituitary along with VF deficits, which likely developed due to the location and rapid growth of the tumor in the setting of pregnancy. The decision for surgery during pregnancy should be based on clinical presentation and should involve a multidisciplinary team to determine the best management that limits complications to the mother and fetus.

University of New Mexico Health Sciences Center, Albuquerque, NM

In the setting of panhypopituitarism, hypothyroidism develops as a result of TSH deficiency rather than a primary insult to the thyroid gland. We describe a case of a patient with acquired panhypopituitarism who developed severe thyrotoxicosis, which persisted despite discontinuation of replacement thyroid hormone.

A 24-year-old Hispanic female with panhypopituitarism following a craniopharyngioma resection in 2010 presented with severe thyrotoxicosis. She initially presented with anxiety and palpitations and was observed to have a mild free T4 (FT4) elevation, which prompted a reduction in her replacement dose of levothyroxine. With progression of her FT4 elevation, the levothyroxine was discontinued, and she was ultimately referred to adult endocrinology when her clinical presentation worsened. Upon further evaluation, she reported weight loss of more than thirty pounds, palpitations, insomnia, fatigue, muscle weakness, bilateral hand tremors, nausea, vomiting, and dizziness. She confirmed a period of several months without levothyroxine. Heart rate was 109 bpm, no proptosis was noted, thyroid examination revealed no thyromegaly, no tremors were appreciated, and there were no findings suggestive of heart failure. FT4 at that time was 7.6 ng/dL (reference 0.7–1.6 ng/dL). Thyroid peroxidase antibody and TSH receptor antibody levels were negative. Thyroid ultrasound demonstrated an atrophic left thyroid lobe; no nodules were identified. Thyroid uptake was low at 1.8% at 24 hours. This unexpected finding prompted a thyroglobulin (TG) level to distinguish subacute thyroiditis or an ectopic source of thyroid tissue from a low-TG state such as factitious thyrotoxicosis. TG was low at 3.3 ng/mL, prompting a pharmacy query, which revealed that she was continuing to fill levothyroxine prescriptions at three pharmacies.

A more comprehensive review of her behavioral health history revealed persistent depressive disorder, PTSD, psychogenic nonepileptic seizures, and numerous recent psychosocial stressors. We approached her care in a non-confrontational manner by presenting a range of possible explanations for her clinical and laboratory findings and sharing our recommended treatment.

This case highlights an unexpected presentation of factitious thyrotoxicosis secondary to surreptitious use of levothyroxine in a patient with acquired panhypopituitarism, and the stepwise evaluation that led to this conclusion. We discuss the strategies implemented in managing this patient and review approaches to patients with factitious disorders.

1Children’s Hospital Colorado, Aurora, CO

2University of Colorado Anschutz Medical Campus, Aurora, CO

Insufficient sleep duration is common among adolescents and may contribute to insulin resistance, dysglycemia, and precursors to type 2 diabetes (T2D). Youth-onset T2D has devastating long-term effects, and thus prevention strategies for adolescents at risk for T2D are needed. Increasing total sleep time (TST) may be one such strategy. We tested the feasibility of a 4-week sleep extension intervention for adolescents treated within an outpatient weight management clinic.

High school students aged 14–19 years with insufficient sleep (<8 hours) on school nights were recruited during the academic year. Exclusion criteria included T2D, medications that affect sleep, and a schedule that precluded participants from adhering to sleep extension (e.g., night shift employment). Following sleep monitoring at home for 1 week, a revised sleep schedule was collaboratively created with a target of increasing time in bed (TIB) by 2 hours. Participants followed this schedule for 2 weeks (W2), returned to the clinic to discuss barriers to adherence to the prescribed schedule, and then followed the revised schedule for another two weeks (W4). Feasibility was assessed by adherence to wearing the sleep watch and submitting sleep diaries, improvement in TIB and TST, and self-reported barriers to following the prescribed sleep schedule. Data are reported as averages ± SD or medians (min, max).
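The feasibility metrics are reported as median (min, max) changes; the short sketch below shows that summary computed on invented hours for four participants.

```python
# Minimal sketch, assuming made-up per-participant hours (n=4).
import numpy as np

baseline_tib = np.array([7.0, 6.1, 8.9, 7.2])
week2_tib = np.array([8.5, 6.0, 9.3, 9.8])
baseline_tst = np.array([5.6, 4.9, 7.1, 5.6])
week2_tst = np.array([6.8, 4.5, 7.9, 7.7])

def summarize(delta, label):
    print(f"{label}: median {np.median(delta):.1f} h "
          f"(min {delta.min():.1f}, max {delta.max():.1f})")

summarize(week2_tib - baseline_tib, "W2 change in TIB")
summarize(week2_tst - baseline_tst, "W2 change in TST")
```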

A total of 6 participants have been recruited to date; however, 2 were withdrawn due to COVID-19 school closures when instruction shifted to online learning. Participants (n=4) were aged 17.0±0 years and 100% Hispanic, with a BMI percentile of 98.4±0.9. All participants completed the intervention; adherence was 99.4% for actigraphy and 65.6% for the sleep diary. At baseline, the average TIB was 7.3±1.6 hours and TST was 5.8±1.5 hours. Compared to baseline, TIB increased by 1.2 (-0.1, 2.6) hours and TST by 1.2 (-0.4, 2.1) hours at W2, and TIB increased by 0.1 (-1.8, 5.3) hours and TST by 0.7 (-1.5, 4.1) hours at W4. Increased TIB was primarily achieved by shifting bedtimes earlier, while waketimes remained relatively consistent. Barriers to adherence included homework, extracurricular activities, and parents' and youths' variable work schedules.

Initial findings indicate that a 4-week sleep extension intervention is feasible in adolescents with short sleep who are seeking treatment for weight management. Participants increased TIB and TST by a median of 1.2 hours. However, additional strategies are needed to maintain such improvements. School and community efforts to delay high school start times may benefit youth at risk for T2D by enabling them to increase TST.

Charles Drew University of Medicine and Science, Los Angeles, CA

Quality of sleep depends on dietary factors such as carbohydrate and vitamin D intake. Sleep quality is challenging to study due to cost, resources, and the availability of research subjects. There is a gap in the literature examining the impact of carbohydrate and vitamin D intakes on sleep quality. We aimed to examine the association between carbohydrate and vitamin D intakes and sleep quality among the adult US population.

We analyzed data from the National Health and Nutrition Examination Survey 2007–2014. Carbohydrate and vitamin D intakes were categorized into three groups based on their distributions. Sleep quality was assessed using hours of sleep, reported sleep problems, and doctor-diagnosed sleep disorder. We used chi-square tests and multiple logistic regression to analyze the data, accounting for the survey design and sample weights.
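To make the descriptive comparison concrete, the sketch below builds intake tertiles and cross-tabulates them against short sleep on synthetic data, then applies a chi-square test. The real analysis also used NHANES sampling weights, design variables, and multiple logistic regression, none of which are reproduced here; all values and names are invented.

```python
# Minimal sketch, assuming synthetic intake and sleep data.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

rng = np.random.default_rng(3)
n = 16415
carb_g = np.clip(rng.normal(260, 90, n), 50, 600)
sleep_hours = rng.normal(7.0, 1.3, n)

carb_group = pd.Series(pd.qcut(carb_g, 3, labels=["low", "mid", "high"]))
short_sleep = pd.Series(np.where(sleep_hours < 7, "<7 h", ">=7 h"))

table = pd.crosstab(carb_group, short_sleep)   # tertile-by-sleep contingency table
chi2, p, dof, _ = chi2_contingency(table)
print(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.3f}")
```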

Of the 16,415 adults, 35% had high carbohydrate intake (>283 g), 31% had low vitamin D intake, 36% slept <7 hours/day, 27% reported a sleep problem, and 9% had a sleep disorder. There was no relationship between high carbohydrate intake or low vitamin D intake and hours of sleep (p>0.05). However, high vitamin D intake was associated with sleep disorders and troubled sleep (p<0.05) after adjusting for confounding variables. Respondents who were members of minority groups, divorced/widowed, overweight/obese, or smokers, or who had diabetes, kidney disease, or depression, were more likely to have low sleep hours relative to other groups (p<0.05).

Our study indicated no association between carbohydrate and vitamin D intakes and sleep quality. Longitudinal prospective studies are needed to examine factors associated with quality of sleep and their mechanisms.

Kern Medical Center, Bakersfield, CA

To present an interesting case with an atypical presentation of a rare disease.

A single patient case report was conducted after IRB approval.

Autoimmune hepatitis (AIH) is a chronic inflammatory disease of the liver that typically presents with AIH-related antibodies. There are two types of AIH: type 1 is associated with anti-smooth muscle antibody (anti-SMA), and type 2 is associated with anti-liver/kidney microsome type 1 antibody or anti-liver cytosol type 1 antibody. About 25% of patients with AIH are asymptomatic. Rarely, patients with AIH present with features of fulminant hepatic failure, with rapidly progressive liver impairment, coagulopathy, and hepatic encephalopathy or coma.

We present the case of a 63-year-old female who presented to the hospital with altered mental status (AMS) of one day's duration. Her labs were significant for acute kidney injury (AKI), elevated liver function tests, and a urinalysis consistent with urinary tract infection (UTI). Computed tomography (CT) of the abdomen and pelvis showed bilateral pyelonephritis. Urine cultures grew pan-sensitive Escherichia coli, and she was treated with 10 days of ceftriaxone. During this course, renal function worsened with uremic-range BUN and she became oliguric; hemodialysis was initiated, with a good response in both renal indices and mental status. The patient was also found to have coagulopathy, with a worsening PT/INR of 32.2/3.24. Further workup of the AKI revealed proteinuria and positive atypical p-ANCA and anti-SMA, raising concern for autoimmune disease in both the kidneys and the liver.

Our patient's symptoms originally raised concern for pyelonephritis secondary to a UTI. Despite a full course of antibiotics, her kidney function continued to decline until she received dialysis. She was also determined to have fulminant hepatitis with coagulopathy, followed by positive autoimmune antibodies. This course leads us to believe that the etiology of her encephalopathy was autoimmune liver and kidney disease.

The main interest of this case report lies in autoimmune hepatitis associated with atypical p-ANCA and anti-SMA presenting as fulminant hepatic failure in the setting of AKI. Atypical p-ANCA appears to be more specific for autoimmune hepatitis than typical p-ANCA [1]. Anti-SMA antibodies are found in about 50% of patients with type 1 AIH [1]. Both nephrology and gastroenterology recommended renal and liver biopsy for definitive diagnosis, to be performed as an outpatient.

1University of Colorado, Denver, CO

2Beth Israel Deaconess Medical Center, Boston, MA

3Jacobi Medical Center, Bronx, NY

A 52-year-old man with poorly controlled type 2 diabetes presented with four months of watery diarrhea. During this period, he also noticed an unintentional 80-pound weight loss, a ‘burning’ sensation on his anterior thighs, and new-onset depression and anxiety. He had no recent history of fever, chills, or night sweats. He was not prescribed insulin, though glipizide had been added to his diabetes regimen about six months prior. Physical examination revealed a cachectic man with a BMI of 18.4 kg/m2, normal vital signs, and an otherwise unremarkable exam. Laboratory evaluation was notable for a white blood cell count of 21.8 x 10^3 cells/mm3, an anion gap of 26, and a glucose level of 280 mg/dL. He also had a CRP of 147.1 mg/dL and an HbA1c of 19.3%. Serologies for HIV-1, HIV-2, HBsAg, and HCV were all negative, and a fecal fat test was normal. Chest x-ray was clear, a transthoracic echocardiogram showed no valvular vegetations, and CT scans of his chest, abdomen, and pelvis were normal.

This patient's concerning cluster of symptoms prompted an extensive workup to rule out cancer, infection, and malabsorptive syndromes. However, his presentation matched an uncommon neuropathy syndrome seen in diabetes. Diabetic neuropathic cachexia (DNC) is an extremely rare neuropathy – only 36 cases have been reported in the literature – but its unique cluster of symptoms often incites a search for a hidden malignancy or insidious infection. DNC most commonly presents in a patient with type 2 diabetes after initiation of an oral anti-hyperglycemic medication, with profound weight loss, mood symptoms, symmetric peripheral neuropathy, and painful limb paresthesias. Management is directed at improving glycemic control, as most patients recover within one to two years with improvement in A1C, though some suffer residual neurologic deficits.

Texas Tech University Health Sciences Center, Lubbock, TX

A 20-year-old male with a history of drug abuse and osteochondritis presented to the emergency department with bilateral leg weakness and numbness of the feet. He had awoken that morning unable to move his legs or his back and fell out of bed, after which he noticed that he had feeling in his legs but not his feet. He was found to be profoundly hypokalemic at 1.6 mmol/L and hypomagnesemic at 1.5 mg/dL. Phosphate levels were not obtained, but calcium was 9.0 mg/dL. These values were verified by a redraw and repeat chemistry. The patient denied nausea, vomiting, and diuretic use, but did report loose bowel movements for many months due to self-diagnosed IBS. He also reported alcohol use of 4–6 beers 1–2 times a week for years. The patient was admitted for intravenous potassium and magnesium replacement. Chest x-ray, MRI of the cervical spine, CT scan of the brain, and a drug panel were all negative.

On day 1 of the hospital stay, the patient's electrolyte levels improved, with a potassium of 3.6 mmol/L, a magnesium of 1.8 mg/dL, a phosphate of 4.0 mg/dL, and a calcium of 8.9 mg/dL. On day 2, he reported improved strength after working with physical therapy but did not feel at baseline. His labs showed a potassium of 3.6 mmol/L, a magnesium of 1.8 mg/dL, a phosphate of 5.0 mg/dL, and a calcium of 9.3 mg/dL. His SARS-CoV-2 antigen test, blood cultures, and urine cultures were all negative. A negative TTG IgA test ruled out celiac disease.

On day 3, potassium was 4.4 mmol/L, magnesium was 1.7 mg/dL, and a newly elevated phosphate of 5.7 mg/dL was noted. Calcium was 9.0 mg/dL. Right arm edema and pain at the PICC line site prompted removal of the line, and an ultrasound showed an occlusive DVT. A heparin drip was ordered, and a V/Q scan was negative. An asymptomatic run of ventricular tachycardia prompted an echocardiogram and troponin evaluation, both of which were negative.

On day 4, his labs revealed a low TSH of <0.01 and hypomagnesemia of 1.8 mg/dL. Potassium was within normal limits. Phosphate remained elevated at 5.5 mg/dL, and calcium was 10.0 mg/dL.

On day 5, the low TSH of <0.01 was confirmed, and an elevated T3 of 9.99 pg/mL and T4 of 2.86 ng/dL were also noted. Potassium at this time was 4 mmol/L, magnesium was within normal limits, phosphate was elevated at 7.0 mg/dL, and calcium was 9.5 mg/dL. A PTH measured at this time was normal at 36.7 pg/mL. The ventricular tachycardia was attributed to hyperthyroidism, and a diagnosis of thyrotoxic periodic paralysis was made. An ultrasound of the neck showed a hypervascular thyroid consistent with Graves' disease or thyroiditis. Thyroid stimulating immunoglobulin was drawn, and the patient began a regimen of 10 mg methimazole 3 times daily.

On day 6 the patient had a potassium of 3.8 mmol/L, phosphate of 6.2 mg/dL, calcium of 9.6 mg/dL and a magnesium of 2.3 mg/dL. On day 7 the patient was discharged.

2University of North Carolina System, Chapel Hill, NC

Knowing of the success of lifestyle changes in the Diabetes Prevention Program, we wondered if similar results could be achieved in a workplace setting with the collaboration of an employer in the Charlotte, NC region.

We compared diabetes incidence and serum lipoprotein concentrations in two groups. Cohort 1 comprised 504 workers (mean age 49, 88% male). Cohort 2 comprised 131 workers (mean age 52, 85% male). Cohort 1 received health coaching by a physician assistant or nurse practitioner who encouraged regular exercise, healthy weight, carbohydrate limitation, smoking abstinence, and blood pressure control. Baseline HbA1c values were compared against thresholds for prediabetes and diabetes (>5.7% and >6.5%, respectively). Cohort 1 workers also received up to $800 per year, based on the above lifestyle choices as well as HbA1c and lipoprotein levels. HbA1c and lipoproteins were also measured in Cohort 2, but no incentives or health coaching were provided, other than individual letters containing their blood test results.

Diabetes developed over 10 years in 59 Cohort 1 participants compared to 93 expected (chi-squared = 8.56, p = 0.003) on the basis of initial HbA1c values (Zhang X et al., Diabetes Care 2010;33:1665). Workers with prediabetes decreased from 192 to 141. In Cohort 2, five participants were diagnosed with diabetes, the same number as expected. However, workers with prediabetes increased from 28 to 36.

Serum lipoproteins improved in both cohorts (table 1).

Preventing diabetes benefits individual workers and their families. Employers who share in the health care costs of their workforce also stand to benefit substantially from diabetes prevention, as these costs are $9,601 per year higher in persons with diabetes. Health coaching and monetary incentives were associated with improved glycemic control as well as lower lipoproteins. Only the lipoprotein improvement was found in the group of workers who received neither intervention in this retrospective analysis. Further prospective observations may identify the respective roles of coaching and monetary incentives. Whether these improvements translate into better cardiovascular outcomes in these workers would also be of great importance.

Initial and follow-up biometrics in workers with (Cohort 1) and without (Cohort 2) health coaching

1Western University of Health Sciences, Pomona, CA

2University of California Los Angeles, Los Angeles, CA

Intimate Partner Violence (IPV) is a public health crisis that impacts 25% of women and 10% of men in the US, totaling 43 million women and 38 million men. IPV screening traditionally occurred in doctors' offices, which were intended to be a safe space, but with the transition to telemedicine, screening is now done at home. This study sought to identify potential educational and practice gaps in care surrounding IPV screening in different settings. We aimed to understand medical students' general experiences (personal and professional), attitudes, and perceptions of IPV screening.

To assess student experiences with IPV screening as both patients and clinicians-in-training, two separate IRB-approved surveys were created and beta-tested for Western U COMP/COMPNW medical students: one for those who had completed clinical rotations and one for those who had not. Questions addressed personal experiences with in-person and telemedicine IPV screenings, how screenings were conducted, and their perceived importance. Fourth-year students were asked additional questions about their experiences observing patients being screened. We used descriptive analysis of the responses to determine the frequency of IPV screening and the modalities by which it was conducted in in-person and telemedicine environments.

170 students participated in the study, for a response rate of 13%: 140 students from the non-clinical cohort and 30 students from the clinical cohort. Overall, 36.9% of students who had been seen for an in-person appointment reported they had been screened for IPV, while only 12.5% of students seen in telemedicine appointments were screened. Among those with in-person appointments, screening was conducted by written survey (31.7%), online survey (10%), or verbal screening (58.3%). Among those seen via telemedicine, screening was conducted by online survey (30%) or verbal screening (70%). The healthcare staff who administered IPV screens shifted from medical assistants (MAs) (35.6%), doctors (35.6%), and nurses (26.7%) for in-person screenings to MAs (28.6%), doctors (28.6%), and receptionists (28.6%) for telemedicine screenings. In the clinical cohort, 56% of students observed in-person IPV screenings during rotations, compared to 8.33% of students with telemedicine experience who observed IPV screenings via telemedicine.

Medical students reported that IPV screening decreased in the telemedicine setting, which could increase the risk of underdetection. Virtual screening was more frequently done verbally and conducted by less-trained personnel. This study is unique in providing the perspective of medical students as both patients and healthcare providers in training, and it demonstrates both educational and practice gaps in this new environment.

1Western University of Health Sciences, Pomona, CA

2Harbor-UCLA Medical Center, Torrance, CA

3University of California Los Angeles David Geffen School of Medicine, Los Angeles, CA

Despite the recommendations of the American College of Obstetricians and Gynecologists, and the health benefits that the Tdap vaccine can provide to prenatal patients and their newborns, nearly 45% of prenatal patients do not receive the vaccine. We surveyed English- and Spanish-speaking prenatal patients to measure uptake and assess patient knowledge and attitudes about the Tdap vaccine. We compared the responses of English- and Spanish-speaking patients to identify whether differences existed.

Using an IRB-approved protocol, we surveyed low-income patients at the Harbor-UCLA Medical Center obstetrics clinic in Torrance, CA for 7 weeks in Summer 2021. Patients were at least 18 years of age and >32 weeks gestational age. Upon obtaining verbal consent, we administered a 31-question, beta-tested survey in their preferred language, English or Spanish.

The survey response rate was 97%. There was a total of 98 participants, 80 of whom answered the survey in English and 18 in Spanish. 49% of subjects were Hispanic/Latino, 35% African American, 3% Caucasian, and 13% of other ethnicities. Most patients (67%) were between the ages of 20–30. 69% of English-speaking patients and 79% of Spanish-speaking patients received the vaccine, for an overall Tdap uptake of 70% among all 98 participants. The most common reasons for refusal of the Tdap vaccine among English-speaking patients were safety concerns for their baby (47%), concerns for themselves (47%), or the belief that they did not need the vaccination (47%). While only 4% of English-speaking patients declined the vaccine because they were unaware they required it, this was the most common reason for refusal among Spanish-speaking patients (66%).

Overall, 70% of patients received the Tdap vaccine, higher than national averages but still short of the goal, leaving substantial numbers of women and their newborns unprotected. Both uptake and reasons for refusal of the Tdap vaccine differed between English- and Spanish-speaking patients. This may suggest a need to look more closely at how Tdap vaccine information is presented in both languages, to ensure that the same information, or information more relevant to different groups, is being relayed to all prenatal patients.

UC Davis School of Medicine, Sacramento, CA

Interdisciplinary research has shown that palliative care improves patient experience by addressing adverse symptoms and psychosocial needs during complex and life-limiting illness. The benefits of early inclusion of palliative care in medical management extend to caregivers, providers, and health systems. Additionally, the integrated interprofessional model of palliative care equips it to address health disparities for populations carrying a disproportionate burden of illness. Rural communities are among the systemically marginalized and underserved, and as such are poised to benefit from the advancement of palliative care. While prior research assessed challenges faced by rural palliative care providers, an analysis of the scope and equity of palliative care services for Northern California's rural underserved has not been undertaken. This study identifies areas of underservice, analyzes corresponding population demographics, and underlines opportunities for addressing specific health disparities.

This research included the collection and analysis of publicly available data pertaining to populations and palliative care organizations of Northern California. Population data sources included the Health Resources and Services Administration and the United States Census Bureau. Palliative care organization data sources included the Center to Advance Palliative Care and the National Hospice and Palliative Care Organization. Data were correlated to identify statistically significant factors pertaining to palliative care access, as well as benchmarks for determining underserved status within the region.

Statistically significant correlations among key population, organizational, and access stratification data identified 13 factors consistent with a county being underserved with respect to palliative care access in the region. Several of these factors also represent risk for a county remaining underserved due to economic and organizational infrastructure deficits. Based on these factors, 19 of the 29 counties included in this study qualified as underserved and are at risk of remaining underserved. Additionally, strong evidence exists that protective factors enhancing palliative care access include increased population diversity across race, sex, and age groups. Significant opportunities were identified for expanding the scope, access, and equity of palliative care services based on these findings.

Our analysis provides a data-driven approach to improving the access and equity of palliative care services in rural Northern California. A collaborative, community-based approach to the early inclusion of palliative care in medical management has the potential to significantly improve health outcomes for medically underserved populations. Additionally, understanding how increased population diversity acts as a protective factor is worthy of further study.

1Washington State University Elson S Floyd College of Medicine, Spokane, WA

2Naval Medical Center San Diego, San Diego, CA

Traumatic brain injury (TBI) is a common injury among veterans who have served in Iraq and Afghanistan. With the number of veterans from these conflicts now approaching 3 million, it is estimated that approximately 20% have suffered at least one TBI. Beyond the structural trauma, TBI may also lead to transient, or even permanent, pituitary insufficiency. Of particular interest are the consequences of TBI-induced hypogonadotropic hypogonadism (HG) on short- and long-term health. TBI also has clear economic implications for the nation, related both to direct medical expenses and indirect costs. The purpose of this review is to summarize the knowledge about post-TBI hypopituitarism, its screening and treatment recommendations, and its costs, with a special focus on the potential impacts of post-traumatic hypogonadism on naval special operators.

We utilized DoD, CDC, and NIH datasets on TBIs, alongside Endocrine Society and AUA guidelines, and NCBI and Google Scholar searches of the following key terms: pituitary dysfunction, traumatic brain injury, hypogonadotropic hypogonadism, hypogonadism treatment, TBI screening, hormone therapy, fertility, special operations, special forces. These sources were used to roughly estimate the prevalence of post-TBI hypogonadism in special operators and its potential costs and consequences, and to carry over existing standards for screening and treatment.

According to recent estimates, the prevalence of TBI among all servicemembers who served in Iraq or Afghanistan from 2000 to 2016 ranges from 11–23%. Persistent hypogonadotropic hypogonadism following TBI, meanwhile, is estimated to fall within the wide range of 8–41% in the general population. Hypogonadism was associated with PTSD as well as other physiologic consequences such as sexual dysfunction, osteoporosis, and neurodegeneration. Recent literature has proposed post-TBI pituitary dysfunction screening and therapy for the general public, but little specific to post-deployment special forces personnel or veterans.

While we can crudely extrapolate from comparisons to other groups, there is a distinct lack of recent data on TBIs in naval special forces. What is clear is the connection between TBIs and subsequent pituitary dysfunction, of which hypogonadotropic hypogonadism is likely the second most common subtype. Taken together, further retrospective and prospective studies are needed to investigate hypogonadotropic pituitary dysfunction after TBIs acquired in the line of duty by naval special forces servicemembers, with the goals of establishing screening guidelines and ultimately providing appropriate treatment algorithms for the preservation of their quality of life, fertility, and protection against comorbid disease.

1The University of British Columbia Faculty of Arts, Vancouver, BC, Canada

2BC Children’s Hospital, Vancouver, BC, Canada

3The University of British Columbia Faculty of Medicine, Vancouver, BC, Canada

4The University of British Columbia Faculty of Dentistry, Vancouver, BC, Canada

Management of cleft lip and palate (CLP) is complex and multidisciplinary, and can last until adulthood. This study aims to describe the healthcare utilization burden of care (BoC) for the management of patients with non-syndromic CLP by identifying a provider burden, characterizing an interaction burden, and calculating an economic burden associated with their health system interactions.

This study was designed as a retrospective chart review of patients with non-syndromic CLP cared for at British Columbia (BC) Children's Hospital between January 1, 1999 and April 30, 2021. Healthcare utilization data for inpatient, outpatient, and emergency encounters were extracted from paper and electronic health records. Community outpatient data were obtained from affiliated specialists. Bottom-up micro-costing was used for hospital costing, our tariff guide was used for provider reimbursement, and postal code was used to calculate patient costs.

Our results indicate that the 58 patients identified with cleft lip and palate had a mean of 156.4 healthcare interactions (consults/follow-ups/surgeries) between the ages of 0–18 years. The distribution of healthcare interactions was 92.4% outpatient, 7.3% inpatient, and 0.3% emergency. Patients had a mean of 11.4 surgical procedures, for which the primary services were plastic surgery (5.6), surgical ENT (2.6), oral surgery (1.2), and dentistry (1.2). The remainder consisted of emergency room visits (0.5) and interactions with outpatient specialty services, most frequently orthodontics (78.2), plastic surgery (14.7), and ENT (13.5). Costing data will ultimately be provided at the physician, patient, and system levels.

Patients born with non-syndromic CLP have a high frequency of healthcare encounters, suggesting a substantial BoC. These findings will inform parents and motivate the development of more efficient healthcare systems to maximize patient access to care by right-sizing resources (provider, support infrastructure, property/plant/equipment).

1University of Washington School of Medicine, Seattle, WA

2University of Wyoming, Laramie, WY

3Harborview Medical Center, Seattle, WA

Thyroid nodules are common in the population and often require fine needle aspiration (FNA) to rule out thyroid cancer. Competency guidelines for residents and fellows to properly perform FNAs are lacking, leaving these essential skills to be taught on the job. Simulation-based trainings offered by professional societies are effective but often require travel and are expensive. We created a brief hands-on module, designed to be 1–2 hours in length, to introduce trainees to basic thyroid ultrasound (US) and US-guided FNA and to improve trainees' comfort with the procedures. This study evaluated whether participating in this module improves the comfort of residents and fellows with thyroid US and FNA prior to performing the procedure on patients, while being cost- and time-effective.

A hands-on training module for US-guided FNAs was developed and offered yearly for 6 years to residents and fellows at Harborview Medical Center. The models used were purchased from Northwestern Medical Center for $25 each, and one model was used for each session. 40 pre-surveys and 26 post-surveys were collected directly before and after the module. Participants were primarily otolaryngology residents (n=15, 11) and endocrinology fellows (n=13, 8) with varying experience. The surveys assessed their comfort level performing US-guided FNAs on a scale of 1 to 5, with 5 being most comfortable and able to perform independent of supervision. The surveys also assessed comfort with interpreting thyroid US and with long- and short-axis US-guided FNA. Effectiveness was assessed by comparing average comfort on the post-survey with the baseline level, regardless of specialty. Significance was determined using a permutation test.
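As a rough illustration of the permutation test mentioned above (not the authors' code; the comfort ratings below are made up), the pre- and post-survey groups can be compared by repeatedly shuffling the pooled ratings:

```python
# Minimal permutation test sketch (illustrative scores, not study data):
# compare mean comfort on the post-survey vs. the pre-survey by shuffling
# group labels and recomputing the difference in means.
import numpy as np

rng = np.random.default_rng(0)
pre = np.array([2, 1, 3, 2, 2, 1, 3, 2])     # hypothetical 1-5 comfort ratings
post = np.array([4, 3, 4, 3, 5, 4])          # hypothetical 1-5 comfort ratings

observed = post.mean() - pre.mean()
pooled = np.concatenate([pre, post])

n_perm = 10_000
count = 0
for _ in range(n_perm):
    shuffled = rng.permutation(pooled)
    diff = shuffled[len(pre):].mean() - shuffled[:len(pre)].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = (count + 1) / (n_perm + 1)   # two-sided, with small-sample correction
print(f"observed change = {observed:.2f}, permutation p = {p_value:.4f}")
```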

On average, participants' comfort with US-guided FNA increased by 1.19 (p=0.0006) on the 1–5 scale, comfort with long- and short-axis US-guided FNA increased by 1.54 and 1.51, respectively, and comfort with interpreting thyroid US improved by 0.97. For endocrinology fellows and otolaryngology residents specifically, comfort with performing US-guided FNA increased by 1.02 and 1.17, respectively.

Overall, residents and fellows showed an improvement in comfort level after completing the module. This improvement was evident not only in performing US-guided FNA but also in FNA technique and US interpretation. Although comfort does not equate to skill, at $25 per session our module is a promising alternative to costly and time-consuming simulation courses, which often cost upwards of $700 per individual. These training modules can be executed in most residency and fellowship training programs to provide accessible training in these important skills.

1University of Utah Health, Salt Lake City, UT

2Michigan State University, Grand Rapids, MI

3Helen DeVos Children’s Hospital, Grand Rapids, MI

4Spectrum Health Medical Group, Grand Rapids, MI

5Neonatal Associates, PHC, Grand Rapids, MI

6Pediatric Surgeons of West Michigan, PC, Grand Rapids, MI

Patients admitted to the pediatric or neonatal intensive care unit (PICU or NICU) at Helen DeVos Children's Hospital in Grand Rapids, MI, are taken directly from the PICU or NICU to the operating room (OR) for surgery. Therefore, these patients do not undergo routine pre-operative (pre-op) checklists in the pre-op holding area. A critically ill pediatric patient underwent a wrong-sided surgery, highlighting the need for a standardized approach to improve completion of the pre-op checklist and communication between the ICU and surgery teams in the perioperative period.

Using quality improvement methodology, the NICU, PICU, pediatric surgery, and pediatric hospital medicine teams completed an A3 form and performed a gap analysis. To address the concerns identified in the root cause analysis, we developed a bedside team huddle composed of ICU, surgery, and anesthesia teams to be performed in the ICU prior to the patient being taken to the OR. We created the acronym iSTOP to outline the components of the pre-op huddle: (i) introductions; (S) surgical procedure to be performed; (T) any tubes, lines, or drains; (O) ongoing plan/intra-operative plan; (P) post-operative care and pain management plan.

Over the course of 90 days, 24 pre-op bedside iSTOP huddles were convened for ICU patients requiring surgery. All team members were present and all key elements of iSTOP were reviewed in over 90% of instances. Surgical site was appropriately marked 100% of the time, and pre-op checklist was completed at least 80% of the time. During this time period, there were zero serious safety events for ICU-to-OR patients.

The iSTOP huddle improved completion rate of the pre-op checklist and enhanced care team communication and patient safety surrounding care transitions between ICU and surgical departments. This huddle format can be extended to incorporate other bedside procedures within the ICUs and other areas of the hospital.

1University of Washington School of Medicine, Seattle, WA

2Virginia Mason Medical Center, Seattle, WA

Adult spinal deformity surgery is associated with high rates of perioperative adverse events (AE). To minimize the risk of AEs, patients must undergo a multitude of various labs, imaging, procedures, and evaluations before surgery. This process can be complicated for both patients and providers, which can lead to surgical delays. To address this problem, Virginia Mason Neuroscience Institute created a comprehensive preoperative checklist, detailing all necessary aspects of surgical optimization. The goal of this study was to evaluate the impact of a comprehensive preoperative checklist on surgical delays in patients undergoing adult spinal deformity surgery. We hypothesized that checklist-directed optimization would reduce the number of surgical delays and need for postoperative intensive care.

Appointed members of the complex spine surgery team were tasked with coordinating surgical optimization using a checklist from 9/1/20 to 8/1/21 (n=51). Complex spine surgeries (to treat adult spinal deformity) performed between 1/1/18 and 8/31/20 (n=142) were not medically optimized via checklist and thus served as a historical control. Indications for surgery including infection, tumor, and urgent/emergent cases were excluded. Surgeries that were delayed due to COVID, or that deviated from the established care pathway, were also excluded. The impact of the checklist on the frequency of pre-/peri-operative delays and the need for postoperative intensive care was investigated. Chi-square analysis was used to interpret these data.
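For illustration, the chi-square comparison of delay rates could be run as below. The 2x2 counts are reconstructed approximately from the reported group sizes and percentages, and the resulting p-value may differ slightly from the reported value depending on whether a continuity correction is applied.

```python
# Chi-square test of independence for surgical delays (counts reconstructed
# approximately from the reported percentages; illustrative only).
from scipy.stats import chi2_contingency

#                 delayed  not delayed
delays = [[27, 115],   # historical control (n = 142, ~19.0% delayed)
          [8, 43]]     # checklist group    (n = 51,  ~15.7% delayed)

chi2, p, dof, expected = chi2_contingency(delays)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```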

Of 235 patients scheduled for complex spine surgery, 193 met our criteria. Checklist-directed surgical optimization did not significantly reduce surgical delays, with 19.0% of surgeries experiencing a delay in the historical control group compared to 15.7% in the study group (p = 0.38). However, patients in the study group were less likely to require postoperative intensive care (11.1%) compared to the control group (25.3%) (p = 0.031).

Checklist-directed pre-surgical optimization was instituted at a single, high-volume spine surgery center. Although this intervention did not reduce the number of surgical delays, it has the potential to increase patient safety, as use of the checklist was associated with a reduced need for postoperative intensive care. Further research on ways to improve interdisciplinary coordination for preoperative optimization to reduce surgical delays is needed to maximize patient safety and minimize AEs.

1University of Washington School of Medicine, Seattle, WA

2Montana State University Bozeman, Bozeman, MT

Innate and adaptive immune responses may play a role in severe complications of SARS-CoV-2 infection (COVID-19). The formation of virus-antibody immune complexes may result in aberrant activation of innate immune cells, including circulating monocytes. Thromboembolic complications are a hallmark of severe COVID-19, in which a hypercoagulable state is observed; the mechanism underlying this state is poorly understood. Tissue factor, also known as coagulation factor III, is key to activating the clotting cascade; it is constitutively expressed extravascularly but can be upregulated on circulating monocytes during inflammation. Conditions that predispose patients to severe COVID-19, such as metabolic syndrome, are associated with elevated plasma levels of endotoxin. We postulate that aberrant inflammatory activation of monocytes by SARS-CoV-2/antibody immune complexes, in tandem with endotoxin, can upregulate tissue factor and induce hypercoagulability.

Immune complexes were formed by mixing inactivated SARS-CoV-2 with Bamlanivimab (Bam), a therapeutic monoclonal antibody specific for the spike receptor-binding domain of SARS-CoV-2. Antibodies to a different domain of the spike were used to capture immune complexes, which were then detected using biotin/avidin. Effects on monocyte cell-surface expression of tissue factor were investigated using flow cytometry. Human peripheral blood mononuclear cells were cultured with SARS-CoV-2, Bam, endotoxin, and combinations of the three. Monocytes were identified by forward/side scatter and CD14 expression.

SARS-CoV-2 immune complexes were readily detectable by immunoassay. Immune complexes were also stable under different storage conditions. These complexes increased endotoxin-induced tissue factor expression on monocytes to a greater degree than did endotoxin alone. Incubation with neither Bam nor SARS-CoV-2 alone induced tissue factor expression.

Antibody-mediated mechanisms are key in clearing SARS-CoV-2 infection. Our results show that the formation of virus-antibody immune complexes may also result in aberrant activation of innate immune cells, including circulating monocytes, leading to tissue factor upregulation. These results may aid in understanding the hypercoagulable state seen in SARS-CoV-2 infection. The next step is to evaluate the effect of immune complexes on in-vitro coagulation by using tissue factor-induced Factor Xa activity assays.

1University of Washington School of Medicine, Seattle, WA

2Fred Hutchinson Cancer Research Center, Seattle, WA

Breast milk is essential to the health and development of children. In addition to nutrients, breast milk contains immune-modulating factors, including antibodies that protect infants from infections. However, breastfeeding is not available to all women and children. While infant formula is designed to meet growing infants' basic nutritional needs, it does not contain factors such as antibodies. We discovered that in the absence of breast milk antibodies, mice develop increased levels of CD4 T follicular helper (Tfh) cells and germinal center (GC) B cells in gut-draining lymphoid tissues. However, in germ-free mice that were also deficient in maternal antibodies (matAbs), Tfh and GC B cell levels closely reflected those of antibody-sufficient neonates. These results suggest that the microbiota drives these Tfh and GC B cell responses. Tfh cells are essential for maintaining host-microbe homeostasis, and dysregulated increases in Tfh cells can alter the microbiota composition, potentially causing colitis. An indispensable role of the microbiota is to prevent the spread of pathogenic infections through colonization resistance. We hypothesized that the increase in Tfh and GC B cells in pups lacking maternal antibodies would alter the intestinal microbiota and its function.

To understand how Tfh cells can alter the intestinal microbiota, we treated half of the maternal antibody-sufficient and antibody-deficient mice with anti-ICOSL, which effectively dampens the Tfh cell response. We then infected all the mice with Salmonella typhimurium, a bacterial pathogen. Fecal samples were collected daily for 5 days. On day 5 post-infection, fecal samples, cecum, and liver were harvested to determine infection burden.

Our data indicate no significant difference in colony-forming units (CFU) between the matAb-sufficient and -deficient groups, signifying that the presence of maternal antibodies does not change susceptibility to S. typhimurium infection. In addition, there was no difference between the groups treated with anti-ICOSL versus control, indicating that early-life Tfh cells giving rise to GC B cells producing T-dependent antibodies do not confer differential resistance to S. typhimurium. This trend was seen when examining both localized and systemic infection across all three organs.

Although neonates that do not receive matAbs in breast milk have increased numbers of Tfh and GC B cells, which have the potential to produce antibodies and change the microbiota composition of the gastrointestinal tract, there was no difference in S. typhimurium infectivity between mice transiently devoid of Tfh cells and controls. To continue exploring the role of colonization resistance, further research is needed to determine how Tfh cells influence host-microbe interactions and, subsequently, infection susceptibility.

1University of Washington School of Medicine, Seattle, WA

2The McGinley Clinic, Casper, WY

3WWAMI Medical Education Program, Laramie, WY

4University of Washington School of Medicine, Seattle, WA

Bone marrow aspirate concentrate (BMAC), along with conservative patient management, offers a minimally invasive option in treating chronic pain from knee osteoarthritis. Knee osteoarthritis affects 35% of adults aged 65 years and older. BMAC has been shown to decrease inflammation and improve cartilage signal on MRI. We hypothesize BMAC injections, along with conservative care, will provide short- and long-term relief of pain associated with knee osteoarthritis.

A retrospective chart review was conducted to identify patients with knee osteoarthritis who received BMAC injections and conservative care in our clinic from November 2013 to November 2019. Under CT and ultrasound guidance, 60 cc of bone marrow was aspirated from the posterior iliac crest. Each 60 cc sample of aspirate was centrifuged, concentrated to 10 cc, and injected into the knee joint under sonographic guidance. Patients were non-weight bearing, using crutches and a compartment-specific off-loading brace, for 3 weeks after the procedure, and partially weight bearing with just the brace for an additional 3 weeks. All NSAIDs were held 10 days prior to and 3 months following the procedure. A 0–10 patient self-reported pain scale was used as the primary outcome. Secondary outcomes included adverse events and additional treatments. Pain scores were collected on the day of treatment and at fixed timepoints up to 3 years post-injection. A two-tailed Wilcoxon signed rank test with a .05 alpha level was used to compare reported pain at each follow-up with baseline.
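The paired comparison described above maps onto a standard Wilcoxon signed-rank call; the sketch below uses fabricated pain scores purely to show the procedure and is not the study's analysis code.

```python
# Two-tailed Wilcoxon signed-rank test on paired 0-10 pain scores
# (fabricated example values; not study data).
from scipy.stats import wilcoxon

baseline_pain = [7, 8, 6, 9, 7, 8, 6, 7, 8, 9]
followup_pain = [5, 6, 5, 6, 4, 7, 4, 5, 6, 6]

stat, p = wilcoxon(baseline_pain, followup_pain, alternative="two-sided")
print(f"W = {stat}, p = {p:.4f}")
```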

Forty-seven patients (71 knees; 26 males, 21 females) with an average age of 64±9 years received BMAC injections and conservative management. These patients were followed for 3 years post-treatment (mean follow-up 30.1±11.0 months). Reported pain was significantly reduced 3 weeks post-injection compared to baseline (47 patients, 71 knees; mean Δ -2.0 points; p<.001; table 1). Pain continued to decrease up to 3 years post-injection compared to baseline (37 patients, 54 knees; mean Δ -3.9 points; p<.001). No adverse events were reported. Thirteen patients (17 knees) subsequently received additional treatments, including injections (10 patients, 12 knees). Three patients (5 knees) underwent knee arthroplasty. No patients underwent repeat BMAC injections during the 3-year follow-up period.

BMAC injections, along with conservative management, represent a safe, effective, and minimally invasive treatment option for knee osteoarthritis pain for up to 3 years. Few patients in our study progressed to knee arthroplasty, suggesting this approach may be a viable alternative to surgery.

Frailty is associated with disability and early mortality and may be reversible. It is accelerated in patients with certain rheumatic musculoskeletal diseases (RMDs). The prevalence of and disease-specific factors associated with frailty across multiple RMDs is unknown.

Data were acquired from FORWARD, The National Databank for Rheumatic Diseases, an observational longitudinal US registry with biannual patient questionnaires. Frailty was measured by a self-reported measure, the FRAIL scale, which queries 5 items: 1) fatigue, 2) resistance (climbing stairs), 3) ambulation, 4) illnesses, and 5) loss of weight, and categorizes those with ≥3 items as frail. Those with missing RMD or frailty variables were excluded (N=117). The prevalence of frailty across RMDs was described. Multivariable logistic regression was performed to identify variables independently associated with frailty in the entire cohort and stratified by RMD.
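As a sketch of the general technique (not the registry's analysis code), odds ratios and 95% confidence intervals like those reported below can be obtained by exponentiating logistic regression coefficients and their confidence bounds; the dataframe and column names here are hypothetical.

```python
# Sketch: multivariable logistic regression with ORs and 95% CIs
# (hypothetical analytic file and column names).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("forward_frailty.csv")        # hypothetical analytic file
covariates = ["age", "female", "overweight", "obese",
              "prior_fracture", "disease_activity", "pain", "biologic_use"]

X = sm.add_constant(df[covariates].astype(float))
fit = sm.Logit(df["frail"], X).fit(disp=False)

# Exponentiate coefficients and CI bounds to get odds ratios
or_table = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_low": np.exp(fit.conf_int()[0]),
    "CI_high": np.exp(fit.conf_int()[1]),
})
print(or_table.round(2))
```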

3,348 individuals were included and 1,084 (32%) were frail. RMDs evaluated were rheumatoid arthritis (71%), osteoarthritis (OA) (16%), fibromyalgia (5%), systemic lupus erythematosus (SLE) (4%), other connective tissue diseases (CTDs) (2%), spondylarthritis (1%), and vasculitis (1%). Frail participants were older (69.8±10.6 years) than non-frail participants (66.3±11.7 years) and had a higher prevalence of obesity (52% vs. 31%). The distribution of frailty was roughly equal across RMDs (~33%) except vasculitis and CTDs, which had a lower prevalence of frailty (20% and 26%, respectively). Ambulation and fatigue were the most common frailty components across RMDs. In the primary multivariable model evaluating the entire cohort, increasing age (OR=1.05 [95%CI 1.04–1.06]), female sex (OR=1.74 [95%CI 1.57–1.95]), overweight (OR=1.49 [95%CI 1.17–1.89]) and obesity (OR=3.04 [95%CI 2.42–3.82]), prior fracture (OR=1.87 [95%CI 1.56–2.26]), increased disease activity (OR=1.24 [95%CI 1.18–1.30]), and pain (OR=1.11 [95%CI 1.07–1.16]) had significant independent associations with frailty (table 1). Biologic use was associated with lower odds of frailty (OR=0.78 [95%CI 0.64–0.96]). Among the RMDs, SLE was associated with increased odds of frailty with OA as the reference (OR=1.70 [95%CI 1.02–3.03]). Overall, disease-specific associations were similar to the primary multivariable model, with obesity and disease activity maintaining statistical significance in most models.

Multivariable logistic regression evaluating factors associated with frailty in the entire cohort (N=2947)

Frailty is common in RMDs, affecting nearly 1 in 3 participants. Obesity, prior fracture, and a diagnosis of SLE had the strongest associations with frailty. Future work is needed to identify factors that predict frailty onset and potential interventions to treat frailty within RMDs.

1Western University of Health Sciences, Pomona, CA

2University of Southern California, Los Angeles, CA

Pemphigus vulgaris (PV) and IgA pemphigus are mucocutaneous autoimmune diseases that commonly present as painful blisters eroding the skin of the face, trunk, scalp, groin, and axillae. The pathogenesis of pemphigus stems from autoantibodies against desmosomal proteins essential to maintaining keratinocyte adhesion. Histopathologic examination may reveal a reduction in desmosomal cadherin proteins and epidermal acantholysis. Currently, there is no cure for pemphigus, though corticosteroids and steroid-sparing agents are commonly used to control the proliferation of lesions and prevent disease progression. Frequently used non-steroidal agents include mycophenolate mofetil, azathioprine, IVIG, and rituximab. Despite these treatment options, patients often suffer long-term corticosteroid complications. Less often used therapies include dapsone and sulfasalazine for PV, and colchicine for IgA pemphigus, which offer potential steroid-sparing alternatives with fewer adverse effects; however, their efficacy has not been clearly established. The objective of our systematic review is to investigate the use of dapsone, sulfasalazine, and colchicine in the treatment of PV and IgA pemphigus.

We searched the PubMed database using the search terms ‘dapsone’, ‘sulfasalazine’, ‘pemphigus vulgaris’, ‘colchicine’, and ‘IgA pemphigus disease.’ Our inclusion criteria comprised published articles written in English between 1970–2021 exploring the use of dapsone, colchicine, or sulfasalazine for pemphigus, including case series, retrospective studies, and randomized controlled trials. We excluded reports with fewer than three patients and review articles. 275 articles were identified, of which 27 relevant studies were eligible; 15 studies were excluded after screening, leaving 12 studies.

46 (63%) of 73 patients responded to dapsone, suggesting its efficacy either as monotherapy or as part of combination therapy for PV. Of 65 patients receiving sulfasalazine adjunct therapy, 61 (94%) achieved clinical remission. Adequate data are lacking regarding colchicine therapy for pemphigus, as the current literature reports only four IgA pemphigus patients treated with this agent.

More research is required to elucidate an effective and safe therapy for individuals burdened with pemphigus. The rarity of this condition and the difficulty of finding adequate control groups present a major barrier to conducting clinical trials of alternative therapies. Going forward, dermatologists may consider dapsone or sulfasalazine adjuvant therapy in PV patients to slowly lower corticosteroid use as lesions begin to diminish and to prevent relapse of cutaneous flare-ups in patients in remission.

1Western University of Health Sciences College of Osteopathic Medicine of the Pacific, Pomona, CA

2Western University of Health Sciences, Pomona, CA

3Rutgers New Jersey Medical School, Newark, NJ

4Western University of Health Sciences, Pomona, CA

Type 2 diabetes mellitus (T2DM) is an inflammatory disease that can alter the immune response, resulting in several physiological manifestations. Glutathione (GSH), a thiol required to maintain intracellular redox homeostasis, is classically deficient in individuals with T2DM. Glutathione also appears to be important in the immune response against Mycobacterium tuberculosis (Mtb) infection. In our previous studies, we identified L-GSH's direct protective effects against oxidative damage as well as its immune-enhancing effects in HIV+ patients. We explored whether similar effects occur in T2DM, which also involves inflammatory and infectious states that could potentiate the replication of Mtb and further diminish the immune response. Specifically, our study aims to further elucidate GSH's role in the granulomatous effector response. In this study, we attempted to determine whether GSH deficiency in diabetic (db/db) mice impairs the formation of granulomas and the granulomatous effector response, to further the understanding of the detailed mechanism of Mtb pathogenesis and the potential for novel therapies against the disease brought on by the infection.

Db/db mice were infected with Mtb and treated with one of 3 regimens: 1) an optimal dose of rifampicin (RIF), 2) a suboptimal dose of RIF, or 3) a suboptimal dose of RIF plus the reduced form of GSH encapsulated in liposomes (L-GSH). Three male and 3 female mice per group were sacrificed at 3 hours and at 2, 4, 6, and 8 weeks post-treatment to study the collective effects of L-GSH and RIF in Mtb infection. Granuloma samples from each group were formalin-fixed and analyzed accordingly. We are currently measuring the survival of Mtb along with the levels of cytokines, free radicals, and GSH in untreated, RIF-treated, and RIF+GSH-treated db/db mice.

We expect to obtain the data from the aforementioned assays shortly.

If our data show a marked elevation of immune defensive cytokines and granuloma formation, with a concurrent reduction in Mtb survival and free radical production, in L-GSH-treated mice, this would support our hypothesis that GSH enhances the granulomatous effector response against Mtb infection in T2DM. In addition, if we observe greater immune responses in RIF+GSH-treated db/db mice, we may further explore the use of GSH as an adjunct therapy for Mtb infection in T2DM.

1Western University of Health Sciences, Pomona, CA

2Riverside Community Hospital, Riverside, CA

A 53-year-old female with a past medical history of anxiety presented to the dermatology clinic with a pruritic eruption of six weeks' duration. She had been gardening without gloves the day prior to the onset of the eruption. A review of systems was noncontributory. Her examination revealed diffuse red papules coalescing into plaques with mild scale involving the scalp, face, neck, torso, and upper and lower extremities including the palms and soles, and sparing the ears, bilateral axillae, elbows, and knees. Her biopsy revealed solar elastosis and abundant multinucleated foreign body giant cells with ingested elastic fibers. The patient's clinical presentation and histopathology were consistent with a diagnosis of actinic granuloma (AG). Her treatment included 20 mg of prednisone PO QAM for one month along with fluticasone 0.05% cream BID to the face and triamcinolone 0.1% cream BID applied to the affected skin on the body. After one month, all lesions had flattened except for post-inflammatory erythematous macules. Sun avoidance and daily sunscreen use were also recommended. At the most recent follow-up, her lesions had resolved, demonstrating the efficacy of corticosteroid treatment.

AG is a rare skin eruption with an unknown pathogenesis; however, it is proposed that a sun-induced inflammatory response attracts giant cells, which form granulomas and degrade elastic material. Lesions begin as multiple small pink papules and nodules that coalesce into demarcated annular plaques with a hypopigmented center, forming the classic ring shape. Actinic elastosis surrounds the outer annular ring, with histiocytes and giant cells within the raised border and minimal to absent elastic fibers in the innermost central zone. Lesions are commonly found on the forehead, neck, extremities, and hands. Our patient differed from the typical presentation in that she described intense pruritus associated with her eruption.

Solar elastosis and abundant multinucleated foreign body giant cells with ingested elastic fibers

Valley Children’s Hospital, Madera, CA

Congenital heart block (CHB) in neonates is associated with high morbidity and mortality. CHB generally occurs due to the presence of maternal autoantibodies of the Ro/La family or cardiac defects.

We describe a neonate born with CHB who was found to have neonatal lupus erythematosus (NLE).

A term female infant was born by cesarean delivery at 37 weeks to a 24-year-old healthy primigravida. At delivery, the baby’s heart rate was 55 beats per minute. The patient was admitted to the neonatal intensive care unit for further evaluation and management of fetal bradycardia.

Electrocardiogram demonstrated third-degree atrioventricular (AV) block and fetal echocardiogram showed a ventricular rate of 60–65 beats per minute and an atrial rate of 116–128 beats per minute. There was good ventricular function without evidence of hydrops.

Physical exam revealed a term, well-appearing infant female with bradycardia but normal S1 and S2 without murmurs. The remainder of her examination was within normal limits. Laboratory evaluation of the infant and the mother showed positive anti-SSA/Ro and anti-SSB/La antibodies.

The baby was diagnosed with NLE and CHB. The infant’s heart rate was monitored closely but she maintained a heart rate greater than 60 beats per minute and hence was discharged home. A pacemaker was scheduled for placement as an outpatient.

NLE is a rare acquired autoimmune disorder that occurs due to passive placental transfer of maternal autoantibodies to SSA/Ro and/or SSB/La. Anti-SSA/Ro autoantibodies are found in about 85–90% of mothers of neonates with CHB, and studies of pregnancies in anti-SSA/Ro positive mothers estimated the risk of CHB to be 1–5%.

Cardiac involvement in NLE is usually irreversible and characterized by second- or third-degree CHB. A ventricular rate of less than 55 beats per minute, hydrops fetalis, or atrioventricular valve regurgitation indicate poor fetal prognosis.

This case emphasizes the importance of considering NLE in infants with fetal bradycardia, congenital AV block or arrhythmias and evaluating the mother and infant for autoantibodies to SSA/Ro and/or SSB/La. We also highlight the need for early referral to cardiology and possible pacemaker implantation in infants who do not respond to medical therapies alone.

Western University of Health Sciences, Pomona, CA

Lymphatic Pump Technique (LPT) is an Osteopathic Manual Medicine technique involving external pressure applied to various lymphatic structures with the goal of improving lymph drainage. Because of the leukocyte content of lymph, LPT is often indicated as an adjuvant therapy in patients with acute or chronic infections. Previous studies have primarily characterized LPT efficacy in terms of clinical or symptomatic outcomes, typically with rather small cohorts of subjects, which has been a criticism of the technique. To address the limitations of past studies, we present a study in which serum concentrations of anti-spike protein COVID-19 antibodies are measured in 100 subjects treated with LPT (experimental group) and 100 subjects without LPT treatment (control group) in conjunction with COVID-19 vaccination. The ongoing study is designed to follow subjects for one year after the first COVID-19 vaccine.

Subjects were split into treatment or control groups in a double-blinded, randomized process. Participants returned for blood draws on the following schedule, based on the day of their first vaccination: day 0 (1st vaccine), day 7, day 21 (2nd vaccine), and days 28, 35, 90, 182, and 365. In the treatment arm, LPT was performed on the day of each vaccination and the following day, for a total of 4 treatments. Blood draws were performed immediately prior to both vaccine administration and treatment with LPT. Blood samples were processed, and a serum biobank was created. Serum anti-spike antibody levels will be determined using quantitative ELISA. All recruited participants were over the age of 18 and had not yet been vaccinated. The study was approved by the WesternU Institutional Review Board.

Recruitment has been successful and is ongoing. Currently, 96 participants have been recruited. 12 participants have dropped out for various reasons, leaving 84 continuing participants. To address attrition, compensation was changed from a total of $100 to $200, disbursed as $25 per blood draw. The attrition rate before the compensation change was 30% (6/20) but fell to 7.9% (6/76) after the change. The total retention rate is 87.5%, with 53.6% (45/84) of participants having completed the 5th blood draw. Additionally, the diversity of the participant population is promising, with significant representation of LatinX individuals (55.9%) and females (58.3%).

The ongoing study is still recruiting participants; however, significant progress has been made, with 84 participants currently on board. Additionally, retention and the diversity of the participant population are both promising. The first set of serum samples will soon be subjected to ELISA antibody testing, and results are forthcoming.

1Fresno Institute of Neuroscience, Fresno, CA

2Providence Holy Cross Medical Center, Mission Hills, CA

3NYU Long Island School of Medicine, Mineola, NY

The SARS-CoV-2 virus continues to have devastating consequences worldwide. Though vaccination has helped to reduce the impact of the virus, new strains still pose a threat to unvaccinated and, to a lesser extent, vaccinated individuals. Therefore, it is imperative to identify treatments that reduce the severity of COVID-19. Recently, acute use of selective serotonin reuptake inhibitor (SSRI) antidepressants in COVID+ patients has been shown to reduce the severity of symptoms compared to placebo. Since SSRIs are widely used antidepressants, the aim of this study was to determine whether COVID+ patients already on SSRI treatment upon admission to the hospital had reduced mortality compared to COVID+ patients not on chronic SSRI treatment.

A retrospective observational study design was used. Electronic medical records of 9,044 patients with a laboratory-confirmed diagnosis of COVID-19 from 03/2020 to 03/2021 at six hospitals were queried for demographic information; admission date; discharge date and disposition; length of stay; admission diagnoses; medications on admission; co-morbidities; age; gender; ethnicity; admission to ICU; ventilator use; supplemental oxygen; oxygen saturation; and discontinuation of antidepressant medications upon ICU admission.

Using R, a logistic regression model was run with mortality as the outcome and SSRI status as the exposure. An adjusted logistic regression model was run to account for age category, gender, and race. All tests were considered significant at p≤0.05.

In this sample, no patients admitted on SSRIs had them discontinued, which is consistent with current recommendations. There was no significant difference in the odds of dying between COVID+ patients on chronic SSRIs and COVID+ patients not taking SSRIs, after controlling for age category, gender, and race. The odds ratio for death among COVID+ patients on chronic SSRIs was 0.90 (95% CI 0.74–1.09; n=832) compared to COVID+ patients not on SSRIs (p=0.29; n=8,211).

In times of pandemics due to novel infectious agents, it is difficult but critical to evaluate the safety and efficacy of drugs that might be repurposed for treatment. This large sample of 9,044 patients suggests that there is unlikely to be a significant mortality benefit from SSRIs for hospitalized COVID-19 patients who are not already on SSRI medications. This study shows the utility of large clinical databases in addressing the urgent issue of determining which commonly prescribed drugs might be useful in treating COVID-19.

1University of Washington School of Medicine, Seattle, WA

2Fred Hutchinson Cancer Research Center, Seattle, WA

Natural and vaccine-induced immunity are important for SARS-CoV-2 control. We evaluated SARS-CoV-2-specific T cell-mediated immune responses in COVID-19 survivors followed through vaccination. We compared T cell tests from Oxford Immunotec (OI) with an in-house laboratory-developed test (LDT). Each used peptides covering Spike (S) and non-vaccine proteins within SARS-CoV-2. We hypothesized that T cell responses to S would increase after mRNA vaccination. We compared the vaccine immune boost in persons previously hospitalized vs. non-hospitalized for COVID-19, and the relationship between T cell and neutralizing antibody (nAb) responses.

20 subjects (median age 62.7, 50% female, most White) with PCR-confirmed SARS-CoV-2 infection donated plasma and peripheral blood mononuclear cells (PBMCs). Samples were from a median of 49 days after recovery from COVID-19 (V0), just prior to the 1st vaccination (E01), and 2–4 weeks after each mRNA dose (E02 and E03). T cell responses were measured by interferon-gamma enzyme-linked immunospot assays (ELISPOT). We compared our LDT assay using the S and nucleocapsid (N) proteins with the OI assay including the S, N, and matrix (M) proteins. nAb levels were measured by fluorescence inhibition. Linear regression was used to assess correlation between tests. Wilcoxon matched-pairs signed rank tests were used to evaluate differences in immune responses over time. Mann-Whitney tests were used to examine differences between hospitalized and non-hospitalized groups.
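For illustration, the between-assay and between-group comparisons could be coded as below with hypothetical spot counts. A rank correlation is shown because rho values are reported (scipy.stats.linregress would be the analogue of the linear regression the methods describe), and the Mann-Whitney call mirrors the hospitalized vs. non-hospitalized contrast.

```python
# Sketch: correlation between two T cell assays and a two-group comparison
# (hypothetical ELISPOT counts; not study data).
from scipy.stats import spearmanr, mannwhitneyu

ldt_spike = [120, 85, 40, 210, 60, 150, 95, 30]   # in-house LDT, S protein
oi_spike = [110, 90, 35, 190, 70, 160, 80, 25]    # Oxford Immunotec, S protein

rho, p_corr = spearmanr(ldt_spike, oi_spike)
print(f"LDT vs OI: rho = {rho:.2f}, p = {p_corr:.3f}")

hospitalized = [120, 210, 150, 95]
non_hospitalized = [85, 40, 60, 30]
u, p_group = mannwhitneyu(hospitalized, non_hospitalized, alternative="two-sided")
print(f"Hospitalized vs non-hospitalized: U = {u}, p = {p_group:.3f}")
```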

Strong correlation was noted between LDT and OI results for S protein at each time point (rho = 0.88, 0.85, 0.60, and 0.77, respectively). Between V0 and E02 (median of 327 days) and V0 and E03 (median of 345.5 days), there were significant increases in S-specific T cell responses (p = 0.0005 and 0.0006, respectively). No additional boost between E02 and E03 (p = 0.54) was observed. Low level (V0) responses to N and M were not boosted with vaccination. No significant difference in S-specific T cell responses between hospitalized and non-hospitalized groups was noted. For both hospitalized and non-hospitalized persons, nAb levels increased significantly after the 1st dose of vaccine (p<0.0001), with no additional nAb increase after the 2nd dose. No correlation between nAb and S-specific T cell responses at either V0 or E03 was noted.

The OI assay is suitable for assessing T cell responses to SARS-CoV-2 mRNA vaccines. T cell responses to N and M did not boost, as expected. In this cohort, primary infection severity did not impact vaccine responses 9 months later. nAb and T cell response increases were complete after one dose. This indicates that a second dose may not be needed, at least if given 3–4 weeks after the first in persons recovering from COVID-19 in the prior 9 months.

1University of California Los Angeles David Geffen School of Medicine, Los Angeles, CA

2University of California Los Angeles, Los Angeles, CA

Newborn protection from infection is dependent on both neonatal innate immune responses & transplacental transfer of maternal antibodies (abs). Better understanding of maternal SARS-CoV-2 ab responses during labor & delivery (L&D) can help evaluate maternal risks for infection, dynamics of placental transfer, and neonatal vulnerability. Limited data describe how COVID-19 severity shapes maternal & infant ab responses. Stratification by severity can help characterize the protection conferred to the newborn. The purpose of this study is to investigate maternal SARS-CoV-2 ab concentrations during L&D by disease severity, and compare infant ab responses at birth when exposed to varying severity in utero.

This project is part of the prospective observational cohort study COVID-19 Outcomes in Mother-Infant Pairs, analyzing mother-infant dyads in the US & Brazil. Serology from 101 pregnant women in Los Angeles (delivery: April 15, 2020–May 28, 2021) who were confirmed SARS-CoV-2 PCR+ during pregnancy was analyzed. Maternal blood at L&D, cord blood, and infant blood at birth were analyzed by ELISA for IgA, IgG & IgM (anti-spike receptor binding domain).

Of the 101 women, 72 had matched cord blood and 86 had infant specimens. 76% of women produced all three anti-SARS-CoV-2 ab classes (IgG, IgM, and IgA); 93% had at least one positive ab class; 5% had no detectable abs. Infant serum at birth contained only IgG and no IgM or IgA. With increased duration between onset of infection & delivery, maternal IgG levels waned and, conversely, transplacental transfer ratios increased (R2=0.27). Maternal IgG levels increased with disease severity. A significant increase in infant IgG levels was observed in children born to symptomatic mothers vs asymptomatic mothers (p<0.0001). A trend towards more robust ab responses was observed in infants with severe/critical COVID-19 exposure in utero (p=0.07).

Demographics and clinical characteristics of mother-infant dyads infected with SARS-CoV-2 during pregnancy

Our findings demonstrate how altered maternal responses across distinct COVID-19 disease severity categories influence neonatal protection against SARS CoV-2.

1Washington State University Elson S Floyd College of Medicine, Spokane, WA

2University of Washington School of Medicine, Seattle, WA

The purpose of this study is to identify common biomarkers and biosignals in COVID-19 patients with heart failure that are associated with increased risk of in-hospital mortality. COVID-19 is associated with worse outcomes in patients with pre-existing comorbidities, such as heart failure (HF). Biomarkers such as B-type natriuretic peptide, troponin, and interleukin-6 have been found to be elevated in patients with HF and COVID-19 and may provide insight into the severity of disease, but they may not be collected in all patients. Evidence on the exact association between prior HF and these commonly utilized biomarkers is limited and should be evaluated further. Our study evaluates biosignals and biomarkers that may be predictive of mortality in COVID-19 patients with a history of HF.

All patients included were 18 years of age or older, had a diagnosis of COVID-19 confirmed by PCR test or hospital clinical criteria, and were hospitalized in the University of Washington (UW) Medicine hospital systems between February 2020 and December 2020. The data were collected as part of a national effort for the American Heart Association COVID-19 CVD Registry. The biosignals analyzed include temperature, heart rate, respiratory rate, diastolic blood pressure, and systolic blood pressure. The biomarkers include admission white blood cell (WBC) count, platelets, serum creatinine, AST (U/L), ALT (U/L), and lymphocyte count. Patient data also tracked previous medical history and disposition at discharge. A LASSO multivariate regression model was used to identify the variables most predictive of mortality among patients with heart failure.
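For readers unfamiliar with the approach, the sketch below shows one common way an L1-penalized (LASSO) logistic regression on standardized predictors can be fit in Python with scikit-learn; the feature names and data are invented stand-ins, not the registry extract.

```python
import numpy as np
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
features = ["temp", "heart_rate", "resp_rate", "sbp", "dbp",
            "wbc", "platelets", "creatinine", "ast", "alt", "lymphocytes"]
X = pd.DataFrame(rng.normal(size=(300, len(features))), columns=features)  # stand-in values
y = rng.binomial(1, 0.2, 300)                                              # in-hospital death (0/1)

# Standardizing first makes each coefficient interpretable per SD (z-score) increase,
# and the L1 penalty shrinks uninformative coefficients to zero.
lasso = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l1", solver="liblinear", C=0.5))
lasso.fit(X, y)
coefs = dict(zip(features, lasso.named_steps["logisticregression"].coef_[0]))
print({k: round(v, 3) for k, v in coefs.items() if v != 0})  # variables retained by LASSO
```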

The study included 54 of the 393 COVID-19 patients (13.7%) with previous diagnosis of heart failure (46% male, mean age 77). Our model estimates that for each standard deviation unit above average (z-score), patients with previously diagnosed heart failure were 13% more likely to die due to COVID-19 (p = 0.021). Among patients with prior heart failure, each z-score increase for WBC count and serum creatinine increased risk of mortality by 3.5% (p = .043) and 5.7% (p = .046), respectively.

Our data suggest that there may be value in monitoring WBC count and serum creatinine levels among COVID-19 patients with prior heart failure. WBC count and serum creatinine have a stronger relationship to mortality in patients with prior heart failure compared to those without heart failure. Immune response may be reduced in heart failure patients, which can account for the decreased WBC count, but further studies are needed to elucidate the exact mechanism and relationship. The results of this study may provide a roadmap to triage heart failure patients based on admission lab values in the COVID-19 environment.

1University of California Davis, Davis, CA

2Adventist Health and Rideout, Marysville, CA

The national rate of congenital syphilis (CS) has dramatically increased recently. It remains unknown if the children of the agricultural worker population (AWP) are more susceptible to CS in California. Identifying subpopulations vulnerable to transmitting CS may inform the design of intervention efforts. Thus, this study set out to determine whether CS incidence rates are associated with the female AWP in California.

Data from all 58 California counties were retrospectively obtained from the California Department of Public Health and the United States Department of Agriculture regarding CS incidence per 100,000 live births and female AWP between December 2014 and December 2018. Female AWP per county was estimated according to the national proportion of female to male agricultural workers provided by the Department of Agriculture. Data were analyzed using geographical information systems mapping and Pearson’s correlation coefficient (r) tests.
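A minimal sketch of the county-level correlation test described above, using scipy with invented values in place of the actual CS incidence and female AWP estimates (the real analysis covers all 58 counties):

```python
import numpy as np
from scipy import stats

# Hypothetical county-level values, for illustration only
cs_incidence = np.array([12.0, 95.3, 40.1, 150.7, 8.2, 66.4])   # CS per 100,000 live births
female_awp = np.array([1200, 15500, 4300, 22100, 600, 9800])    # estimated female workers

r, p = stats.pearsonr(cs_incidence, female_awp)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")
```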

The average statewide CS incidence was 68.2 cases per 100,000 live births in 2018. CS incidence and female AWP were concentrated heavily in California’s agricultural Central Valley, with a few coastal exceptions (figure 1 A-C). CS incidence and female AWP were moderately but significantly correlated (r = 0.343; 95% confidence interval = 0.093–0.552; p < 0.001) (figure 1D).

Geographical distributions of CS incidence rates (A: in 2014, B: in 2018) per 100,000 live births, and agricultural worker populations (C: in 2017). Correlation between CS incidence (2018) and female AWP (2017) (r = 0.343; 95% CI = 0.093–0.552; p < 0.001) (D). CS – congenital syphilis

Our findings provide evidence that California counties with a higher incidence of CS tend to be home to a greater number of female agricultural workers than counties with low incidence of CS. Given these findings, this study suggests the urgent need to implement culturally appropriate and enduring prenatal healthcare interventions that prioritize treatment of maternal syphilis and prevention of CS in female AWP.

1Western University of Health Sciences College of Osteopathic Medicine of the Pacific, Pomona, CA

2Western University of Health Sciences College of Osteopathic Medicine of the Pacific-Northwest, Lebanon, OR

3Harvard University T H Chan School of Public Health, Boston, MA

4Kenya Medical Research Institute, Kisumu, Kenya

Schistosomiasis is a neglected tropical disease impacting the health of millions of humans primarily in regions of poverty. Freshwater snails are obligate vector hosts of the flatworm parasites (schistosomes) that cause this disease. Our research goal is to uncover the genetics underlying immunity of snails to schistosomes so that novel control strategies may be developed to prevent human infection. Pathogen recognition receptors (PRRs) are part of the first line of defense against pathogens. They are hypothesized to be under balancing selection due to selection pressure on pathogens to evolve novel epitopes to evade immune recognition and on host receptors to detect pathogens. Thus, PRRs and other immune loci are expected to be among the most diverse regions of the genome. Using whole genome data from an important African snail vector, Biomphalaria sudanica, we hypothesize that diverse regions of the genome will be enriched with immune related loci, and that we can identify novel PRRs through annotation of these regions.

Five B. sudanica strains (collected: Lake Victoria, Kenya) were sequenced using PacBio and Illumina paired-end reads. Mean inter-line diversity was calculated across the genomes, segmented into smaller windows (10–100 kb). For each window with a mean inter-line diversity value >1%, up to 1 Mb of surrounding nucleotides was annotated and transmembrane domains (TMDs) were identified using predictive software. To determine if TMD peptides were over-represented in high-diversity regions of the genome, we compared the proportion of TMD peptides against the proportion in 30 randomly assigned contig regions.
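Conceptually, the window screen amounts to averaging per-site diversity in fixed-size windows and keeping windows whose mean exceeds 1%; the Python sketch below illustrates that logic on simulated values and is not the study pipeline.

```python
import numpy as np

def high_diversity_windows(pi_per_site, window_size=10_000, threshold=0.01):
    """Return (start, end, mean_pi) for non-overlapping windows with mean diversity > threshold."""
    hits = []
    for start in range(0, len(pi_per_site) - window_size + 1, window_size):
        window = pi_per_site[start:start + window_size]
        mean_pi = float(np.mean(window))
        if mean_pi > threshold:
            hits.append((start, start + window_size, mean_pi))
    return hits

# Example with simulated per-site diversity values (mostly low diversity)
rng = np.random.default_rng(0)
pi = rng.beta(1, 200, size=100_000)
print(len(high_diversity_windows(pi, window_size=10_000)))
```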

67 of 6,815 windows met our nucleotide diversity threshold of 1% divergence. 421 of 818 immune-suspected peptides were identified as having TMDs and were over-represented in regions of high diversity when compared to randomized control regions, supporting our hypothesis. Immune-related genes associated with Schistosoma resistance in other species, including PTC2 and GRC, were also identified using this bioinformatic approach.

Our findings support the diversity-based approach to identifying PRRs which successfully identified known B. glabrata PRRs and novel PRRs in B. sudanica genomes. Our established list of candidate genes for pathogen recognition will provide a foundation guiding resistance studies, gene knockout and GWAS with Biomphalaria species.

1University of California Irvine, Irvine, CA

2Children’s Hospital of Orange County, Orange, CA

Vitamin D has been shown to be helpful in preventing certain respiratory tract infections, but the link between COVID-19 severity and vitamin D levels remains unclear. The purpose of this study was to clarify the relationship between COVID-19 severity and vitamin D levels through a literature review.

A systematic literature review using PubMed and Google Scholar was conducted. Our inclusion criteria included those studies that a) measured vitamin D levels, b) included both a control and subject group, c) measured the association between COVID-19 severity and vitamin D levels, and d) took into account other potential COVID-19 related comorbidities among controls versus subjects groups. Studies that measured only prevention of COVID-19 infection (positivity rate) or mortality were excluded.

Eight studies satisfied our inclusion criteria (see table 1 below). A majority of studies showed significantly lower vitamin D levels in the more severe COVID-19 subjects compared to control patients with less severe COVID-19. Most studies that assessed COVID-19 outcomes based on vitamin D deficiency (VDD) found VDD to be associated with worse outcomes, including more severe disease and increased mortality. The limitations of this review include inter-study variability in the co-morbidities included in multivariate analysis, variability in the definition of VDD among different studies, and a lack of information on vitamin D supplements and other treatments before infection or during hospitalization in several studies. Finally, a causal relationship could not be assessed because all studies were observational, and information on the vitamin D levels before hospital admission, during healthy state, was not available in a majority of studies.

Studies evaluating the association between vitamin D level and COVID-19 disease severity

There may be an association between lower vitamin D levels and more severe COVID-19 disease. However, larger longitudinal studies that not only measure vitamin D levels pre-COVID-19 disease but also take into account all variables, such as comorbidities and treatments that could affect disease severity, are warranted.

1University of California Irvine School of Medicine, Irvine, CA

2Children’s Hospital of Orange County, Orange, CA

Urinary tract infections (UTIs) are one of the most common bacterial infections in women. Concerns over the effectiveness of antibiotics in preventing recurrent UTIs, due to antibiotic resistance and the adverse effects of antibiotics on healthy microbiota, have raised the necessity to investigate reliable non-antibiotic treatments for preventing recurrent UTIs. It has been proposed that probiotics or lactobacillus may be effective in preventing infections by restoring the normal vaginal flora. The purpose of this study is to investigate the effectiveness of vaginal probiotic suppositories for prevention of recurrent UTIs in adult women.

A systematic literature review was conducted through databases such as PubMed and Google Scholar. Only studies that were published after 1990, and compared use of vaginal probiotics with a control group in adult women with recurrent UTI were included. Studies with follow-up period of <6 months were excluded.

We found 5 studies that fit our inclusion criteria (see table 1). In the majority of studies, the frequency of recurrent UTIs was lower in patients who received probiotic vaginal suppositories when compared to controls. However, there was great variability among the studies with respect to the probiotic formulation as well as treatment dose and frequency. The applications were intermittent and varied from daily to weekly to monthly. The probiotic species used in different studies included L. rhamnosus, L. fermentum, and L. crispatus. The sample sizes were small and did not divide the patients into different categories based on risk factors or co-morbidities. In addition, the bacterial cause of UTI was not mentioned in the majority of studies. Mild side effects were noted in both the probiotic and control groups, and included increased vaginal discharge, vaginal odor, mild irritation, and dysuria.

Studies on the efficacy of intravaginal probiotics in preventing recurrent UTIs

Our review suggests a promising role for intermittent vaginal probiotic suppositories in the prevention of recurrent UTIs in adult women. Larger prospective studies with longer follow-up periods are needed to determine the optimal probiotic dosage and frequency in different groups of patients with recurrent UTI.

1Intermountain Healthcare, Salt Lake City, UT

2University of Utah Health, Salt Lake City, UT

A high nucleated red blood cell (NRBC) count in a neonate at birth has been suggested as a biomarker for fetal hypoxia. However, it is not clear if it indicates when the hypoxia occurred. We aimed to measure the time between administering a high dose of darbepoetin, simulating the marked rise in erythropoietin that follows a hypoxic event, and the first appearance of NRBC in the blood. Limited observations of this interval, the ‘NRBC emergence time,’ in human neonates suggest it is greater than 24 hours.

We obtained serial blood counts on ten newborn lambs; five dosed with darbepoetin (10 µg/kg) and five placebo controls, to assess the NRBC emergence-time.

The first appearance of NRBC was at 24 h (mean±SD, 2757±3210 NRBC/µL vs. 0/µL in controls). Peak was 48–72 h (16,758±8434/µL vs. 0/µL in controls), followed by fewer at 96 hours (7823±7114/µL vs. 0/µL in controls). Similarly, reticulocytes peaked at 48–72 h (113,094±3210/µL vs. 10,790±5449/µL in controls), with no changes in platelets or leukocytes.

NRBC/µL (left) and NRBC/100 WBC (right), before (time 0) and at intervals following darbepoetin administration to five term lambs, as well as values in five similarly instrumented control lambs. Values from the darbepoetin recipients are shown by a solid black circle and those from the controls by a solid gray circle.

The NRBC emergence time in newborn lambs is similar to reports from newborn humans. By extrapolation, if a neonate has a high NRBC at birth, the erythropoietic stimulus likely occurred within the interval 24 to 96 hours prior to birth.

1University of Utah Health, Salt Lake City, UT

2Mahidol University Faculty of Medicine Ramathibodi Hospital, Bangkok, Thailand

3Brigham Young University, Provo, UT

Every molecule of heme metabolized to bilirubin releases one molecule of carbon monoxide (CO). On that basis, hemolysis can be detected and quantified by measuring CO in exhaled breath. We constructed reference intervals for end-tidal carbon monoxide (ETCOc) levels of neonates 28–34 weeks’ gestation to assess the hemolytic rate. New instrumentation allows providers to non-invasively measure ETCOc in preterm neonates with low tidal volumes and breathing rates up to 70 breaths per minute. Reference intervals for term and late preterm neonates exist, but until now none have been created for preterm neonates.

Prospective four-NICU study in Bangkok, Thailand, and Utah, USA. Neonates born at 28–34 weeks’ gestation and up to 28 days old were eligible. Once informed consent was received, a modified CoSense ETCOc analyzer was used to record results. Data from the CoSense devices were linked to patient charts to obtain demographic information.

Values from days one through 28 were charted and upper (>95th percentile) reference interval limits were calculated. During the entire 28 days, the ETCOc upper reference intervals of babies in Bangkok were higher than those in Utah (p<0.01). No differences were found due to sex, or earliest vs. latest gestation at birth (both p>0.1). Preterm neonates in Bangkok and Utah had higher ETCOc values during the first 48 hours after birth than thereafter (p<0.01).
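As a simple illustration of how a day-specific upper reference limit can be derived (hypothetical values, not the study data), the 95th percentile of ETCOc can be computed per day of life:

```python
import pandas as pd

# Hypothetical longitudinal ETCOc measurements (ppm) by day of life
etco = pd.DataFrame({
    "day_of_life": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "etcoc_ppm":   [2.1, 1.7, 3.0, 1.9, 1.5, 2.6, 1.4, 1.8, 2.2],
})

# Upper reference limit per day: the 95th percentile of that day's values
upper_limits = etco.groupby("day_of_life")["etcoc_ppm"].quantile(0.95)
print(upper_limits)
```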

The dashed line and the grey circles are preterm infants from Bangkok. The solid line and open circles are preterm infants from Utah, USA

Using the reference interval chart we created, the hemolytic rate of preterm infants ≥28 weeks can be assessed. This identification allows us to focus subsequent testing to find the cause of the hemolysis, administer more intensive phototherapy, and to assure consistent in-and out-patient follow-up to those with hemolytic jaundice.

1Intermountain Healthcare, Salt Lake City, UT

2University of Utah Health, Salt Lake City, UT

3University of Utah, Salt Lake City, UT

Using ten years of multihospital data, we identified neonates with ‘severe anemia at birth’, defined by a hemoglobin/hematocrit within the first six hours after birth below the 1st percentile. We determined whether caregivers recognized anemia within the first 24 hours after birth, the probable cause of the anemia, treatment given, and whether review suggested a different cause of anemia than listed in the medical record.

Data from neonates born 2011–2020 were obtained from the Intermountain Healthcare Data Warehouse. We reviewed records of all infants with severe anemia at birth. We then categorized the cause as hemorrhage, hemolysis, hypoproduction, a combination of etiologies, or unable to be determined.

Of 299,927 live births, 344 had severe anemia. In 153 (44.5%) the anemia was unrecognized during the first 24 hours. The lowest hemoglobin/hematocrit values were among those with hemorrhage vs. hemolysis (P<0.013) or vs. hypoproduction (P<0.001). In infants with severe anemia secondary to hemorrhage, abruption/other perinatal events and fetomaternal hemorrhage (FMH) were the most likely etiologies. Disseminated intravascular coagulation (DIC) was a common hemolytic cause of anemia, with 85% of DIC cases coincident with hemorrhagic anemia.

Reference intervals for hemoglobin (A) and hematocrit (B) of neonates on the day of birth by gestational age. The bold line indicates the 1st percentile reference interval, below which we label neonates as having ‘severe anemia at birth’. Dashed lines show the 5th percentile, median, and 95th percentiles.

Severe anemia at birth often went unrecognized on the first day. Earlier recognition may be facilitated by an electronic medical record-associated hemoglobin/hematocrit nomogram, with values <1st percentile clearly identified.

1University of Utah Health, Salt Lake City, UT

2Intermountain Healthcare, Salt Lake City, UT

We previously reported fetomaternal hemorrhage (FMH) in 1/9160 births, and only one neonatal death from FMH among 219,853 births. Recent reports indicate FMH is not uncommon among stillbirths. Consequently, we speculated we were missing cases among early neonatal deaths. We began a new FMH initiative to determine the current incidence.

We analyzed births from 2011 to 2020 where FMH was diagnosed. We also evaluated potential cases among neonates receiving an emergent transfusion just after birth, whose mothers were not tested for FMH.

Among 297,403 births, 1,375 mothers were tested for FMH (1/216 births). Fourteen percent tested positive (1/1,599 births). Of those, we found 25 with clinical and laboratory evidence of FMH adversely affecting the neonate. Twenty-one received one or more emergency transfusions on the day of birth; all but two lived. We found 17 others who received an emergency transfusion on the day of birth where FMH was not tested for, but was likely; eight of those died. The 42 severe (proven + probable) cases equate to 1/7,081 births. We judged that 10 of the 42 had an acute FMH; in the others, it likely occurred more than a day before birth.

We estimate that we fail to diagnose >40% of our severe FMH cases. Needed improvements include: 1) education to request maternal FMH testing when neonates are born anemic, 2) education on false-negative FMH tests, and 3) improved FMH communication between neonatology, obstetrics, and the blood bank.

University of Utah Health, Salt Lake City, UT

Perinatal anemia is a massive global public health burden with an estimated global prevalence of approximately 40%. Severe anemia increases the risk of maternal mortality and can adversely affect fetal development. Adequate correction of anemia is essential for a healthy pregnancy and infant, but universal screening and monitoring is not the standard of care in most low- and middle-income countries (LMICs). In lieu of universal screening and treatment, providing access to iron folic acid (IFA) tablets is considered an effective and cost-efficient intervention to prevent and treat anemia of pregnancy. However, despite widespread availability of IFA tablets, anemia prevalence continues to be high, and the presence of IFA programs may falsely reassure clinicians that patients taking them have adequate hemoglobin.

The study took place at Mota Fofalia Community Health Center (MF-CHC) in Gujarat, India, operated by a public-private partnership. The University of Utah operates an academic global health program in collaboration with MF-CHC and assists the health center in sustainable capacity building in maternal-child health. As part of a community-based antenatal care (ANC) program, we recruited a cohort of pregnant women from the surrounding community to complete a standardized nutrition and health survey and participate in scheduled prenatal visits according to WHO and Indian ANC guidelines, which include measurement of vital signs and ANC guideline-based interventions. At each ANC visit, a blood hemoglobin level was drawn and each participant was asked if she was currently taking IFA or albendazole, an antiparasitic.

A total of 501 women were included in the study. 448 (89%) reported taking IFA and 53 (11%) reported not taking IFA. The average hemoglobin for those taking IFA was 10.11 g/dL (IQR 9.3–11.1) with an average gestational age at screening of 23.0 weeks, while the average for those not taking IFA was 10.41 g/dL (IQR 9.8–11.6) (p=.28) with an average gestational age of 10.9 weeks. In the group taking IFA tablets, 97% were also taking albendazole, while only 21% of mothers not taking IFA tablets were taking albendazole.

In areas with a high prevalence of anemia, patient compliance with standard IFA antenatal therapy is not an adequate indicator of intervention success. While it appears many mothers begin taking IFA as they become pregnant, the presence of readily available IFA therapy in the community is not sufficient to address perinatal anemia.

1LAC+USC Medical Center, USC, Los Angeles, CA

2Keck School of Medicine, USC, Los Angeles, CA

Hemoglobin (Hb) analysis among patients in the Neonatal Intensive Care Unit (NICU) is essential in diagnosing anemia, a common problem among neonates of low birth weight or preterm infants. This same population is especially at risk for iatrogenic anemia, given the need for frequent laboratory analysis while having a small blood volume. Hb measured by way of point-of-care (POC) meters or blood gas analyzers, which each require 1–2 drops of blood per reading, could prevent significant phlebotomy in the NICU as compared to the Complete Blood Count (CBC), which requires 0.6 to 1 milliliter. This study aims to determine the validity of POC and blood gas Hb measurements as compared to Hb values from the CBC.

This is a retrospective study of patients admitted to the LAC+USC Medical Center NICU between January 2020 and April 2021 with paired Hb measurements from the laboratory-run CBC and either POC Hb from the HemoCue B 201 or blood gas Hb from the Gem Premier 5000. Qualifying data were divided into groups based on the time between blood draws, the first comprising measurements collected within 12 hours of each other and the second comprising measurements collected between 12 and 24 hours of each other. POC or blood gas Hb and CBC Hb measurements collected more than 24 hours apart were excluded. T-tests were used for analysis of continuous, normally distributed variables. Regression analysis was performed to determine the relationship between paired Hb measurements. Statistical significance was set at p <0.05.
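To make the grouping and correlation steps concrete, the sketch below pairs hypothetical CBC and alternative Hb values, buckets them by hours between draws, and computes Pearson correlations; the values and column names are invented and are not the study data.

```python
import pandas as pd
from scipy import stats

# Hypothetical paired measurements: CBC Hb, POC/blood gas Hb, and hours between draws
pairs = pd.DataFrame({
    "cbc_hb":      [14.2, 11.5, 9.8, 16.0, 12.7, 10.4, 13.1, 15.3],   # g/dL
    "alt_hb":      [14.0, 11.9, 9.5, 15.6, 12.2, 10.9, 13.6, 14.8],   # g/dL
    "hours_apart": [3, 5, 10, 11, 14, 16, 20, 23],
})

lt12 = pairs[pairs["hours_apart"] < 12]
h12_24 = pairs[(pairs["hours_apart"] >= 12) & (pairs["hours_apart"] <= 24)]
# pairs more than 24 hours apart would be excluded entirely

for label, grp in [("<12 h", lt12), ("12-24 h", h12_24)]:
    r, p = stats.pearsonr(grp["cbc_hb"], grp["alt_hb"])
    print(f"{label}: n={len(grp)}, R={r:.2f}")
```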

We identified 250 subjects with qualifying paired Hb values from the CBC and POC Hb or blood gas Hb. There were 488 paired CBC and blood gas Hb samples in the <12 hour group, and 243 paired samples in the 12–24 hour group. There were 479 paired CBC and POC Hb samples in the <12 hour group, and 290 paired samples in the 12–24 hour group (table 1). The correlation coefficient (R) for the CBC-blood gas Hb pairs was 0.89 in the <12 hour group and 0.83 in the 12–24 hour group. The correlation coefficient for the CBC-POC Hb pairs was 0.87 in the <12 hour group and 0.75 in the 12–24 hour group (figure 1).

Correlation between CBC HB and paired POC/blood gas Hb

There was a strong correlation between paired POC Hb or blood gas Hb and CBC Hb values obtained within 12 hours of one another. Our results show that POC Hb or blood gas Hb should be considered as alternatives to CBC Hb; NICU patients would benefit through prevention of iatrogenic anemia. Prospective studies with age- or weight-based grouping and planned pairing within pre-defined time periods would be beneficial in determining whether the correlation persists between POC and CBC Hb measurements in the NICU.

LAC+USC Medical Center, Los Angeles, CA

Measuring bilirubin levels in infants admitted to the neonatal intensive care unit is done to avoid hyperbilirubinemia and bilirubin toxicity. Practitioners strive to minimize blood tests to reduce patient discomfort and iatrogenic anemia. To minimize blood draws in the monitoring of bilirubin levels, we assessed the accuracy of alternative methods of measurement via blood gas analyzers and transcutaneous bilirubin monitoring.

Using serum bilirubin as a gold standard, we analyzed the accuracy of simultaneous measurements from blood gas analyzer and transcutaneous monitoring. The accuracy of blood gas and transcutaneous bilirubin measurements was ascertained with correlation coefficient and by calculating mean differences between the serum bilirubin levels and the two alternative methods.
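A minimal illustration of the agreement metrics described above (correlation coefficient plus mean difference), computed on made-up bilirubin values rather than the study data:

```python
import numpy as np
from scipy import stats

# Hypothetical simultaneous measurements, mg/dL
serum = np.array([5.2, 8.1, 10.4, 12.3, 6.7, 9.5])      # gold-standard serum bilirubin
transcut = np.array([5.6, 7.8, 10.9, 12.9, 6.2, 9.1])   # alternative method (e.g. transcutaneous)

r, p = stats.pearsonr(serum, transcut)
diff = transcut - serum
print(f"r = {r:.3f}; mean difference = {diff.mean():.2f} ± {diff.std(ddof=1):.2f} mg/dL")
```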

Study consisted of 86 patients with gestational ages 24 to 41 weeks. The correlation coefficient for serum bilirubin vs transcutaneous measurements was r = 0.893 (p<0.00001). The correlation coefficient was r = 0.9283 (p<0.00001) for preterm infants, and r = 0.8392 (p<0.000013) for term infants. The mean difference between serum bilirubin vs transcutaneous measurements was 0.45 with a standard deviation of 1.55 mg/dL. The correlation coefficient for serum bilirubin vs blood gas bilirubin was r = 0.959 (p<0.00001). The correlation coefficient was r = 0.9291 (p<0.00001) for preterm infants, and r = 0.9742 (p<0.00001) for term infants. The mean difference between serum bilirubin vs blood gas bilirubin was 0.21 with a standard deviation of 0.87 mg/dL.

Both transcutaneous and blood gas analyzer bilirubin levels had a strong correlation with serum levels, with the blood gas analyzer being slightly more accurate. We plan to continue collecting bilirubin data for a total of 6 months. If the accuracy of transcutaneous or blood gas analyzer bilirubin measurements is acceptable, we plan to pursue these alternative methods of bilirubin measurement over the following 6 months and assess to what extent we were able to minimize blood draws.

Los Angeles County University of Southern California Medical Center, Los Angeles, CA

Very low birth weight (VLBW) infants with prolonged respiratory morbidities such as bronchopulmonary dysplasia (BPD) may need to be discharged home while receiving oxygen therapy. The risk factors causing prolonged respiratory support vary for these infants. The study was performed to characterize factors associated with VLBW infants who require oxygen therapy at discharge.

Data on all VLBW infants were gathered retrospectively from the electronic medical record between 2009 and 2021, with IRB approval obtained prior to the study. Home oxygen therapy was statistically analyzed using SPSS Version 28 statistical software against early neonatal outcomes such as ventilation, intubation, chest compressions, or surfactant, and against common neonatal morbidities, including intraventricular hemorrhage (IVH), BPD, retinopathy of prematurity (ROP), and patent ductus arteriosus (PDA).

Of 560 VLBW infants, 144 (25.7%) were discharged home on oxygen. Significant risk factors included maternal histologic chorioamnionitis and intubation and chest compressions at delivery. These infants had lower gestational age as well as lower birth weight. They were more likely to require surfactant and invasive ventilation at 24 hours, and to receive additional ventilatory support including high-frequency oscillatory ventilation or jet ventilation (table 1). Infants who were discharged with home oxygen therapy were significantly more likely to have BPD and a hemodynamically significant PDA requiring surgery. Associated comorbidities included ROP requiring treatment, severe IVH, and increased length of stay in the NICU (table 1). Regression analysis revealed lower birth weight, longer duration of invasive ventilation, PDA requiring surgical intervention, and BPD to be the most significant predictors.

Home oxygen therapy requirement against common neonatal outcomes

The need for high-frequency ventilation such as jet and oscillatory ventilation, as well as chest compressions, is associated with home oxygen need at discharge. Additionally, VLBW infants who required home oxygen therapy were more likely to have needed invasive ventilation, delivery room intubation, and surfactant therapy. They were also more likely to have common neonatal morbidities such as BPD, severe IVH, severe ROP, and PDA.

1University of Washington School of Medicine, Seattle, WA

2Fred Hutchinson Cancer Research Center, Seattle, WA

Breast milk is an important contributor to the neonatal microbiome. Studies have associated breastfeeding with a decreased risk of acquiring inflammatory bowel disorders later in life. As breastfeeding is not possible for all mothers and children, gaining a mechanistic understanding of this process can lead to the development of early-life interventions that foster beneficial host-microbiota relationships.

We have previously shown that breast milk antibodies are important for maintaining mucosal homeostasis. Mice deficient in breast milk antibodies exhibit perturbations in mucosal immunity, including elevated T follicular helper (Tfh) cell and germinal center (GC) B cell responses in the gut-associated lymphoid tissues. We hypothesize that the Tfh and GC B cell response generated by the neonate in the absence of breast milk antibodies target resident mucosal bacteria and lead to long-term alterations in the abundance of gut-microbiota.

We extracted DNA from intestinal wall-associated microbes and quantified their abundance using quantitative PCR of the bacterial 16S rRNA gene. We looked at both small intestinal wall microbes and large intestinal wall microbes at ages 3, 5, and 11 weeks in maternal antibody sufficient or deficient pups. To explore the effects of Tfh cells and GC B cells, we treated half of each group with anti-Inducible T-cell Co-Stimulator Ligand (anti-ICOSL) antibody, which blocks Tfh and GC B cell formation. We used a paired t-test to determine the significance of our results.

We found no significant difference in the abundance of wall-associated microbes across all four groups in the small or large intestine. These data suggest that the breast milk antibody dependent response as well as the Tfh-cell dependent response do not alter the abundance of wall-associated microbes.

Studies are ongoing to determine if breast milk antibodies affect intestinal microbe composition. Limitations of this study may include the drinking water our mice were exposed to throughout the duration of this experiment. Due to standard mouse husbandry operations within our institution, our mice were given acidified water, which we have recently discovered significantly alters their microbial composition. Our goal is to repeat this experiment with non-acidified water, as we suspect any potential differences in microbial abundance may have been masked by the effects of acidified water.

1Loma Linda University School of Medicine, Loma Linda, CA

2Loma Linda University Department of Basic Sciences, Loma Linda, CA

Gestational long-term hypoxia (gLTH) is a significant stressor that leads to multiple diseases, including pulmonary hypertension. Evidence indicates that gLTH causes oxidative stress and inflammation, which change cell structure and function. These effects are driven by changes in cellular metabolism, protein expression, and transcriptional regulation. Our proteomic data show that gLTH leads to vascular remodeling and specifically to reduction of collagen 1A1, 1A2, and 3A1, though the data do not delineate where in the arterial wall these changes are occurring. We hypothesized that gLTH causes loss of collagen in all arterial layers, which we tested by visualizing and quantifying the collagen content in different layers of fetal pulmonary arteries.

Fetal sheep pulmonary arteries from normoxic and gLTH environments were obtained and stained with picrosirius red (PSR) dye to visualize collagen in captured images of arterial biopsies by assessing the optical density (OD) of birefringence from polarized light. Fluorescence microscopy was used to capture images of the arterial samples. OD birefringence values from ImageJ analysis, which are inversely related to crosslinked collagen, were evaluated across the various vascular layers and treatments by analysis of variance.

The image analysis showed a significant decrease in optical density, and therefore enhanced crosslinking, in the adventitia compared to the media for all samples belonging to either normoxic or gLTH groups. However, there was no significant difference in optical density of the adventitial versus medial layers between vessels from normoxic and gLTH fetuses.

The results indicate that optical density quantification can be used to detect substantial differences in collagen and crosslinked structure between the medial and adventitial layers. The findings also raise the possibility that neither collagen nor its crosslinked structure is affected by gLTH. The data provide evidence that this technique needs refinement to properly visualize the locations where modest changes in expression may occur. Secondarily, the inability of the PSR stain to distinguish among certain collagen subtypes leaves open the possibility that a shift in collagen type may affect biomechanical processes associated with pulmonary vascular development or gLTH. The PSR staining method may not have been definitive, but this study is an important steppingstone towards an experimental strategy for visualizing modifications in vascular collagen isoform expression that complements contemporary analytical quantification techniques and provides unique insight into vascular structure and function.

1The University of Utah School of Medicine, Salt Lake City, UT

2University of California Davis, Davis, CA

3University of California Davis Health System, Sacramento, CA

4Harvard Medical School, Boston, MA

Bronchopulmonary dysplasia (BPD) is characterized histopathologically by alveolar simplification in preterm infants who are chronically mechanically ventilated. Mesenchymal stromal cell extracellular vesicle (MEx) treatment improved alveolar formation in mouse neonatal hyperoxia models of BPD. We tested the hypothesis that MEx would improve alveolar formation in chronically mechanically ventilated preterm lambs.

Preterm lambs (128d; term ~150d; ~28w human gestation) were exposed to antenatal steroids and perinatal surfactant, then resuscitated and supported by mechanical ventilation for 6–7d (Drager VN500, SIMV). Physiological targets were PaO2 60–90 mmHg, PaCO2 45–60 mmHg, O2 saturation 88–92%, pH 7.25–7.35. One group was treated with MEx (60 × 10⁶ cell equivalents; 10 mL; n=8; 4F 4M) at hours of life 6 and 78 (iv); the control group received vehicle (MEx diluent in saline; 10 mL; n=8; 4F 4M). We used morphometry and stereology to quantify structural indices of alveolar formation, and immunoblot to quantify apoptosis (cleaved caspase 3) and proliferation (PCNA).

Radial alveolar count and secondary septal volume density were significantly greater (* p<0.05) in the MEx-treated group compared to the control group (figure 1A and B). Distal airspace wall thickness was significantly narrower in the MEx-treated group compared to the control group (figure 1C). Normalized cleaved caspase 3 protein abundance was not different between the MEx-treated and control groups (0.71±0.05 vs 0.69±0.04, respectively). Normalized PCNA protein abundance was significantly lower in the MEx-treated group versus the control group (0.43±0.05 vs 0.55±0.05, respectively). No differences were detected between females and males.

Quantitative morphological results show that alveolar formation is significantly better (* p<0.05)

We conclude that MEx improved alveolar formation in chronically mechanically ventilated preterm lambs. We speculate that MEx may be an effective therapy to promote normal structural development of the lung in preterm infants who require mechanical ventilation and are at-risk of developing BPD.

1Los Angeles County University of Southern California Medical Center, Los Angeles, CA

2University of Southern California Keck School of Medicine, Los Angeles, CA

In the adult lung, alveolar type II cells (AT2s) serve as facultative stem cells. They proliferate in response to injuries to regenerate and repair the alveoli. There is a lack of information on whether AT2s in immature lungs undergoing alveologenesis, such as those of preterm neonates, may act as resident stem cells and undergo proliferation in response to injuries that cause bronchopulmonary dysplasia (BPD). Genetic inactivation of both TGFβ receptors in secondary crest myofibroblasts (SCMF) arrested alveologenesis, causing a BPD phenocopy. Alveolar arrest was accompanied by a decreased number of SCMF and AT2s, suggesting cross-communication between the two cell types during alveologenesis. To determine the mechanism, we quantified AT2 cell numbers in control and mutant lungs at postnatal days 7 and 14 (PN7 and PN14) during alveologenesis.

A total of 12 mouse lungs (control, n=6, and mutant, n=6) were examined at PN7 and PN14 (n=3 each for control and mutant). Immunohistochemistry and immunofluorescence were performed on multiple samples of lung tissue. AT2s were identified as SPC-positive cells. Proliferating AT2s (pAT2s) were identified as SPC/Ki67 double-positive cells. To correct for hypoplasia in BPD samples, all results were normalized against total lung cells, identified as DAPI-positive.

In PN7 lungs, the ratio of AT2s to total cells (SPC+/DAPI+) was higher in mutant vs control lungs (AT2s: 10.77% vs 8.58%, respectively), likely due to a reduction in DAPI+ cells that included reduced SCMF. The pAT2s remained unchanged (pAT2: mutant 0.47% vs control 0.54%). In contrast, in PN14 lungs, both total AT2s and pAT2s decreased in mutant lungs vs control (total AT2s: 8.62% vs 10.54%, respectively; pAT2: 0.50% vs 0.93%, respectively), indicating that proliferation of both SCMF and AT2s had decreased.

In mutant lungs, TGFβ receptor inactivation decreases SCMF numbers, while AT2s are unaffected in early phases of BPD-like pathogenesis. As the phenotype and loss of SCMF become more established and widespread, inhibition of AT2 proliferation becomes measurable. Two conclusions are derived from these observations: 1) targeted SCMF have a regulatory impact on AT2 proliferation, which is a regenerative response to injury; 2) despite important differences between the mouse phenocopy and human BPD, similar dynamics may occur in the lungs of preterm infants who develop BPD. The response of the endogenous stem cells (i.e., AT2s) in the lung to initial injuries may be governed by an orthologous mesenchymal cell type (SCMF-like) and their communication with AT2 stem cells.

Supported by NHLBI, NIH and the Hastings Foundation

1The University of Utah School of Medicine, Salt Lake City, UT

2University of California Davis, Davis, CA

3Harvard Medical School, Boston, MA

Mesenchymal stromal cell extracellular vesicle (MEx) treatment has therapeutic efficacy in murine neonatal hyperoxia models of bronchopulmonary dysplasia (BPD). Whether MEx will be beneficial in chronically ventilated preterm neonates is unknown. We tested the hypothesis that MEx would improve respiratory system physiological outcomes in chronically mechanically ventilated preterm lambs.

Preterm lambs (128d; term ~150d; ~28w human gestation) were exposed to antenatal steroids and perinatal surfactant, then resuscitated and supported by mechanical ventilation for 6–7d (Drager VN500, SIMV). Physiological targets were PaO2 60–90 mmHg, PaCO2 45–60 mmHg, O2 saturation 88–92%, pH 7.25–7.35. One group was treated with MEx (60 × 10⁶ cell equivalents; 10 mL; n=8; 4F 4M) at hours of life 6 and 78 (iv); the control group received vehicle (MEx diluent in sterile saline; 10 mL; n=8; 4F 4M). We report daily physiological outcomes for respiratory severity score (RSS), oxygenation index (OI), arterial-alveolar (A-a) gradient, and oxygen saturation/FiO2 (S/F) ratio. Liver and kidney function tests were assessed.

MEx-treated preterm lambs were ~1d younger (* p<0.05) and weighed less (*) at delivery than control lambs (figure 1A and B). MEx-treated lambs tolerated enteral feeding and maintained weight (*) whereas control lambs were less tolerant of enteral feedings and lost weight over 7d (figure 1B). RSS, OI, and A-a gradient were lower for MEx-treated group (*) compared to the control group. S/F ratio was higher for the MEx-treated group (*) compared to the control group. Neither liver nor kidney toxicity was detected. Differences were detected between females and males.

Respiratory system physiological outcomes for preterm lambs treated with MEx (black circles) or vehicle (white circles)

We conclude that MEx improved respiratory system physiological outcomes in chronically mechanically ventilated preterm lambs. We speculate that MEx may be an effective therapy for appropriate functional development of the lung in preterm infants who require mechanical ventilation.

1University of California San Diego, La Jolla, CA

2The University of Texas Health Science Center at San Antonio, San Antonio, TX

Oxidized phospholipids (OxPL) are formed during inflammatory processes, and they are known to induce cellular stress and apoptosis. The role OxPL play in lung inflammation is not known. OxPL are recognized by the IgM natural antibody (Ab) E06, which can bind to and block many of the pro-inflammatory properties of OxPL. Our objective was to investigate the role of OxPL in hyperoxia-induced acute lung injury (HALI) and to determine whether neutralizing OxPL using E06 would ameliorate or prevent hyperoxia-induced lung injury.

C57BL/6J (B6) sensitive and DBA/2J (DBA) resistant mice were exposed to hyperoxia for 48h to induce lung injury. We examined the content of OxPL by immunochemistry with E06 and examined inflammatory responses by measuring changes in immune cell composition in the lung by fluorescence-activated flow cytometry and by immunohistochemistry. We measured gene expression changes in whole lung by RNA-seq. Data were analyzed with FlowJo and HOMER. To examine the pathogenic role of OxPL, we also exposed E06-scFv transgenic mice to hyperoxia. These mice generate a high plasma level of functional E06-scFv (single-chain variable fragment of E06).

Using immunohistochemistry, we observed an accumulation of OxPL in the lungs of sensitive B6 mice after hyperoxia. OxPL were more abundant in lungs of B6 mice compared to resistant DBA mice. To further explore the molecular determinants of interstrain susceptibility to oxygen, we performed transcriptomic analysis of the whole lung. Transcripts that most distinguished B6 from DBA mice were associated with apoptotic and cell death pathways. To test whether OxPL have a pathogenic role, we exposed E06-scFv mice (on a B6 background) to the same hyperoxic conditions. Unlike B6 mice, E06-scFv mice did not show activation of apoptosis and cell death related gene pathways.

We observed significant increases in OxPL accumulation following acute hyperoxia exposure in the lungs of injury sensitive compared to resistant mice. OxPL accumulation in the lungs of B6 mice was associated with upregulation of apoptosis and cell death related genes. Blocking of OxPL by the secreted E06-scFv Ab resulted in a significant reduction of apoptosis protecting the lung from HALI. These data suggest that OxPL are not only a useful biomarker for hyperoxia induced lung injury but that an OxPL neutralizing antibody could be used to ameliorate or prevent HALI. Furthermore, the magnitude of interstrain variability in lung gene expression could form the basis for understanding human interindividual variability in susceptibility to oxygen induced injury.

1University of California Davis, Sacramento, CA

2University of California Davis, Davis, CA

Gas exchange is severely impaired during cardiopulmonary resuscitation (CPR) in the cardiac arrested lamb model despite ventilation with 100% O2. Optimizing gas exchange during neonatal CPR may improve cerebral oxygen delivery (cDO2), prevent rapid fluctuations in PaCO2, and stabilize cerebral blood flow. We hypothesize that asynchronous continuous chest compressions with high frequency percussive ventilation (HFPV) in preterm asphyxiated cardiac arrested lambs will result in improved gas exchange and cDO2 compared to 3:1 compression-to-ventilation (C:V) resuscitation.

Time-dated preterm (~125d gestation; equivalent human ~25 weeks) fetal lambs were intubated, instrumented to measure cerebral blood flow and arterial blood pressure, and catheterized to collect venous and arterial blood. After instrumentation, lambs were asphyxiated by umbilical cord occlusion until asystole and delivered. Lambs were randomized to (1) 3:1 C:V resuscitation using a T-piece resuscitator following the neonatal resuscitation program (NRP) algorithm (control), or (2) asynchronous continuous chest compressions (120 compressions/min) with HFPV using a TXP-5 ventilator (intervention). First dose of epinephrine (0.03 mg/kg) was given at three minutes and repeated q3min until return of spontaneous circulation (ROSC). Lambs in the control group that achieved ROSC were managed on conventional ventilation and lambs in the intervention group were maintained on HFPV. Ventilation parameters and O2 were adjusted to maintain SpO2 at 90–95% and PaCO2 between 45–60 mmHg.

Eight lambs were studied and all achieved ROSC. Baseline characteristics, time to ROSC, and epinephrine doses were similar between groups (figure 1). Mean (SD) PaCO2 was 158 (24) mmHg and the mean (SD) PaO2 was 47 (42) mmHg 15 minutes post-ROSC despite maximum ventilation support and 100% O2 in the control group compared to a mean (SD) PaCO2 of 50 (11) mmHg and a PaO2 of 60 (24) mmHg in the intervention group (table 1).

Comparison of blood gases at fixed timepoints

Comparison of baseline characteristics between groups. There was no significant difference of characteristics between groups (p > 0.05)

Resuscitation using asynchronous continuous chest compressions during HFPV is feasible, with similar success of ROSC and improved gas exchange in an asphyxiated cardiac arrested neonatal lamb model. Further studies are required to validate our results and to assess lung injury by immunohistochemistry and biomarkers.

University of Colorado Denver School of Medicine, Aurora, CO

Continuous positive airway pressure (CPAP) is an increasingly common method of non-invasive respiratory support for premature infants to avoid more invasive and potentially injurious ventilation strategies. However, the long-term effects of CPAP on the developing lung are poorly understood. Therefore, we seek to understand the effect of daily CPAP on the structure and mechanical function of the developing lung in neonatal rats.

Control dams were kept in room air and allowed to give birth spontaneously. At day 1, pups were divided into three groups: pups that were secured to a CPAP device and received daily positive pressure of 6 cm H2O (CPAP-6); pups that were secured to a CPAP device and received daily airflow but no pressure (CTL-0); and pups that were simply removed from their cage without being secured to the CPAP device for the duration of daily experiments (CTL). Daily CPAP lasted for 2 hours on days 1–2, and 3 hours on days 3–13. At day 14, we measured lung mechanics by flexiVent (total respiratory system resistance (Rrs) and compliance (Crs)). Lung structure was determined by mean linear intercepts (MLI), radial alveolar counts (RAC), and pulmonary vessel density (PVD).

There were no differences in body weights between groups. CPAP-6 rats demonstrated decreased Rrs (p<0.05) compared to CTL and CTL-0, and increased Crs (p<0.01) when compared to CTL. Lungs from CPAP-6 rats showed impaired alveolarization compared to CTL as assessed by decreased RAC (p<0.05) and increased MLI (p<0.001). Pulmonary vessel density was reduced in CPAP-6 vs CTL rats (p<0.01). There were no significant differences in lung structure between CTL-0 rats and either CTL or CPAP-6 rats.

We found that daily CPAP decreased alveolar and vascular growth and altered lung mechanics in infant rats. We speculate that although less invasive than other ventilation strategies, non-invasive positive pressure respiratory support can potentially have negative effects on the normal developing lung, but its net benefits or harm in the setting of lung disease remains uncertain.

Loma Linda University School of Medicine, Loma Linda, CA

Hemoglobin A1c (HgbA1c) is a marker of an individual’s glycemic exposure over a preceding 2–3 month period. Minimal evidence currently exists to support increased infection risk following mastectomy. We aimed to evaluate the association of HgbA1c with the incidence of surgical site infection (SSI) in patients undergoing mastectomy and immediate breast reconstruction.

An institutional database was queried for patients with a CPT code for reconstruction AND a diagnosis code for breast malignancy from January 1, 2014 to June 20, 2021. We defined SSI incidence by a diagnosis or procedure for SSI within 90 days following mastectomy. A one-sample t-test was performed to determine if there was a significant difference between the average HgbA1c of the standard patient population and the sample SSI group. A chi-square test was used to analyze data for correlations between SSI rates in diabetics and non-diabetics. The patients were analyzed using a standard chi-square test based on a 2×2 contingency table.
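To make the 2×2 analysis concrete, the sketch below runs a chi-square test and computes a relative risk from a hypothetical diabetic vs non-diabetic by SSI table; the counts are illustrative only and are not the study's data.

```python
import numpy as np
from scipy import stats

#                 SSI   no SSI   (hypothetical counts)
table = np.array([[17,  600],    # diabetic
                  [ 5,  632]])   # non-diabetic

chi2, p, dof, expected = stats.chi2_contingency(table)

# Relative risk of SSI: risk in diabetics divided by risk in non-diabetics
risk_dm = table[0, 0] / table[0].sum()
risk_no_dm = table[1, 0] / table[1].sum()
rr = risk_dm / risk_no_dm
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}, RR = {rr:.2f}")
```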

A total of 1,386 patients were included in the query; 268 of these had received a pre-op HgbA1c, and 136 had received a pre-op HgbA1c and had a DM diagnosis. Only 22 patients fit our defined SSI sample group criteria. The average population HgbA1c was 6.74 (N=268); the average HgbA1c of diabetics was 7.33 (N=136); the average HgbA1c of non-diabetics was 6.13 (N=132). The average sample HgbA1c was 6.94 (N=22); the average HgbA1c of diabetics was 7.34 (N=17); the average HgbA1c of non-diabetics was 5.58 (N=5). The one-sample t-test of the average HgbA1c value in patients with DM vs non-diabetics in the sample group was not significant, t(21) = -4.210, p = 3.94E-4. The chi-square test revealed that diabetics were more likely than non-diabetics to develop SSI following mastectomy, X2 (1, N = 1254) = 76.43, p < .001.

Characteristics of all patients and patients in the surgical site infection sample group

The association of HgbA1c with the incidence of surgical site infection (SSI) following mastectomy is evident. We found the presence of a DM diagnosis to be a better prognostic tool for SSI than HgbA1c level alone. Chi-square analysis determined the relative risk of SSI following mastectomy in diabetic patients to be 2.28x that of non-diabetics. Considering the SSI sample group, we propose an HgbA1c threshold of 7.34 as indicating increased risk of post-surgical complications.

University of California San Diego, La Jolla, CA

To elucidate changes due to COVID-19 on patient demographics, surgical care, logistics, and patient outcomes in spine patients.

This is a retrospective study of patients who had spine surgery at UCSD from 3/1/19 to 5/31/19 (pre-COVID-19) and 3/1/20 to 5/31/20 (first COVID-19 surge). 331 subjects met the study criteria. Demographic and surgical data were collected from medical records. Pain levels at pre-operative, discharge, short- (3–6 month) and long-term (9–15 month) timepoints were extracted.

There were no significant differences in patient demographics including age, BMI, gender, race, ethnicity, ASA rating, smoking status, or diabetes status between groups (p>0.14). The diagnostic indications for surgery (spondylolisthesis, tumor/infection, spondylosis, fracture) were not different between groups (p>0.13). There were no differences in operating room duration and skin-to-skin time (p>0.64); however, length of stay was 4.7 days shorter during the COVID-19 pandemic (p=0.03) and more cases were classified as ‘urgent’ (p=0.04). Preoperative pain scores did not differ between groups (p=0.58); however, pain levels at discharge were significantly higher in patients operated upon during COVID-19 (p=0.04) and trended towards remaining higher in the short term (p=0.06) but not the long term (p=0.21) after surgery (table 1).

The pandemic resulted in a greater proportion of 'urgent' spine surgery cases and shorter hospital length of stay. Pain levels at discharge and at short-term timepoints were higher following surgery; however, these differences did not persist in the long term.

1Loma Linda University School of Medicine, Loma Linda, CA

2Loma Linda University, Loma Linda, CA

The objective of this study is to assess the value of performing secondary hand surgery in patients who underwent complex hand procedures at a quaternary referral academic medical center.

Electronic medical records (EMR) of 166 patients over a five-year period were identified using keywords related to procedures generally performed as secondary operations, such as tenolysis, contracture release, and capsulotomy. Of those patients, 50 fit within the parameters of the study. For these 50 patients, hand therapy data were obtained, the percentage of motion before and after the secondary surgery was calculated, and these data were used to calculate the overall change in motion. A paired-sample t-test was performed to determine whether there was a significant difference in average motion between measurements taken before and after secondary hand surgery. One-sample t-tests were performed to determine whether the average change in range of motion differed between the overall patient population and subgroups defined by common comorbidities (asthma, diabetes mellitus, hypertension, obesity, age over 45 years) or gender.

During a median follow-up period of 8.5 months and an average follow-up period of 18.87 months, a total of 75 complications in 50 patients were recorded. The average percentage of full motion before surgery was 49.43% and the average percentage of full motion after surgery was 66%, giving an overall change in motion of +16.58%. Hand motion measurements following secondary hand surgery (M = 0.660, SD = 0.259) compared to measurements preceding secondary hand surgery (M = 0.491, SD = 0.188) demonstrated a significant improvement in motion, t(50) = 24.50, p = 4.177E-5.

Although we speculate that we would have seen an even greater impact from secondary hand surgeries if there had been greater adherence to post-surgical hand therapy, our p value indicates results that are statistically significant. Therefore, we conclude that secondary hand surgery performed on patients with complex hand injuries has a significant measurable impact and we believe that a similar study in a larger population would yield similar results.

University of California Davis, Sacramento, CA

Malnutrition is associated with increased morbidity and mortality in patients with head and neck cancer (HNC) undergoing surgery. Despite the profound impact malnutrition has on this patient population, objective screening tools are still lacking in a clinical setting. Without a clear approach to identify malnutrition, there is currently a barrier to capturing patients with inadequate nutrition, delaying interventions that could otherwise be implemented to optimize their nutritional status. Therefore, recognizing the need for a tool, the aim of this study is to assess the ability to use the geriatric nutrition risk index (GNRI) to screen for malnutrition among HNC patients and determine if there is an association between GNRI scores and postoperative complications.

A retrospective review of medical records was conducted for patients undergoing surgical resection at a tertiary academic hospital from June 2012 to June 2021. Patients were included if surgical excision was the primary treatment modality and a serum albumin was obtained within the 6 months prior to surgery. A total of 44 HNC patients were included in the study and analysis. Preoperative body weight and serum albumin were abstracted from medical records to calculate the GNRI.
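The abstract does not spell out the exact formula used; as context only, the sketch below follows the commonly cited Bouillanne formulation of the GNRI, with hypothetical input values and an assumed Lorentz estimate of ideal body weight.

```python
# Illustrative sketch of the GNRI as commonly formulated (Bouillanne et al., 2005);
# the study abstract does not specify its exact implementation, so treat this as
# an assumption. Input values are hypothetical.

def ideal_body_weight_kg(height_cm: float, sex: str) -> float:
    """Lorentz estimate of ideal body weight (an assumed convention)."""
    if sex == "male":
        return height_cm - 100 - (height_cm - 150) / 4
    return height_cm - 100 - (height_cm - 150) / 2.5

def gnri(albumin_g_per_L: float, weight_kg: float, height_cm: float, sex: str) -> float:
    """GNRI = 1.489 * albumin (g/L) + 41.7 * (weight / ideal body weight)."""
    ibw = ideal_body_weight_kg(height_cm, sex)
    ratio = min(weight_kg / ibw, 1.0)  # ratio is conventionally capped at 1
    return 1.489 * albumin_g_per_L + 41.7 * ratio

# Example with a hypothetical patient: albumin 3.5 g/dL = 35 g/L, 60 kg, 170 cm, male
score = gnri(albumin_g_per_L=35, weight_kg=60, height_cm=170, sex="male")
print(f"GNRI = {score:.1f}  (scores <97.5 were classified as malnutrition in this study)")
```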

Of the 44 patients included in the study, 30 (68%) were men and 14 (32%) were women, with a mean age of 62 ± 12 years. Malnutrition was defined by a GNRI score of <97.5 and was present in 27% of patients (n=12). Malnourished patients had significantly higher rates of postoperative complications and more often required discharge to a skilled nursing facility (SNF) compared to patients with normal GNRI scores.

Complications in patients with low and high GNRI scores

A low GNRI score appears to be a predictor of increased complications after head and neck surgery. The GNRI is a simple tool that utilizes serum albumin and body weight to objectively assess nutritional status. Results from this study suggest that, in the future, the GNRI may be a clinically useful approach to screen for malnutrition and identify patients who are at high risk for complications during the postoperative course.

1University of California Davis, Sacramento, CA

2University of California Davis Health System Department of Orthopaedic Surgery, Sacramento, CA

3The University of Texas Health Science Center at Houston, Houston, TX

The incidence of periprosthetic distal femur fractures is increasing due to the increasing number of knee arthroplasties being performed in the aging population. The purpose of this study was to analyze the demographics, fracture characteristics, and treatment strategies associated with periprosthetic distal femur fractures (PDFF) compared to native distal femur fractures (NDFF) in order to identify important clinical differences between these groups that might help guide management.

A retrospective study was conducted of 209 patients >18 years old who underwent surgical treatment for either a native distal femur fracture (NDFF) or a periprosthetic distal femur fracture (PDFF) about a total knee arthroplasty (TKA) from January 2006 to December 2020. Fractures were classified on CT imaging using the Association for Osteosynthesis/Orthopedic Trauma Association (AO/OTA) system. Demographics, fracture characteristics, fixation constructs, and surgical outcomes were compared between subjects with PDFF vs. NDFF.

Out of 70 patients with PDFF, 81.1% were female and 18.6% were male, with an average age of 80 years (range 49–102 years). PDFFs were most often isolated (80%) and comminuted (85%) injuries with AO classification 33A.3 (71.4%). Out of 139 patients with NDFF, 53.2% were female and 46.8% were male, with an average age of 57 years (range 18–96 years). NDFFs were commonly comminuted (92.1%) injuries with AO classification 33C.2 (28.1%) or 33A.3 (25.2%). NDFFs were extra-articular (54.0%) or intra-articular (46.0%). Nearly half of subjects with NDFF (48.2%) experienced concomitant fracture of the ipsilateral knee (14.4%) or tibial plateau (15.1%). Intramedullary nailing was the most common fixation construct for both fracture groups (42.6% PDFF; 36.7% NDFF). The second most common fixation construct was a combined nail/plate for PDFF (17.3%) and a lateral locking plate for NDFF (20.9%). Patients with PDFF experienced shorter lengths of stay (6.36 days vs. 11.4 days) but had higher complication rates compared to NDFF (5.7% vs. 4.4%). The incidence of low bone density (osteopenia or osteoporosis) was higher in those with PDFF compared to NDFF (55.7% vs. 19.4%).

PDFFs frequently occur as isolated comminuted injuries with greater complication rates compared to NDFF. Though intramedullary nailing remains the most common fixation construct for both NDFF and PDFF, stabilization via a combined plate/nail construct is increasingly being used for PDFFs. Elderly women with a TKA and poor bone quality are a high-risk group for PDFF. Further research should examine how physicians can improve their surgical and clinical approach to this type of fracture in the affected population.

Washington State University Elson S Floyd College of Medicine, Pullman, WA

Antithrombotic medications are widely used for numerous medical conditions. Research has demonstrated that antiplatelet and anticoagulant use can influence surgical outcomes as well as prolong intraoperative times. However, sparse literature exists examining the effects of antithrombotic use on dermatologic surgery outcomes, specifically in Mohs and cosmetic flap procedures. The purpose of this study is to elucidate the relationship of anticoagulant and antiplatelet therapy to intraoperative time and closure size in dermatologic and cosmetic surgery patients. We hypothesize that patients who use daily antithrombotics will have longer intraoperative times and larger closure sizes in dermatologic flap procedures.

A retrospective medical record review was conducted of all patients who underwent Mohs or cosmetic flap surgery at Chesnut Institute of Cosmetic & Reconstructive Surgery in Spokane, Washington between March 5, 2019 and December 14, 2020. Procedures of 40 minutes duration or less were included. This yielded a total of 243 surgeries with complete information about intraoperative outcomes. Patients were stratified into 5 cohorts based on medication usage (table 1) with documentation of skin closure size and total procedure length.

A statistically significant reduction in intraoperative times (p-value 0.03) was observed in patients who were not taking any form of antithrombotic medication (cohort 5), as compared to patients in cohort 4 who were actively taking anticoagulant medications. Other classes of antithrombotic medication (cohorts 1–3) were associated with higher average intraoperative times relative to cohort 5, however the difference was not statistically significant. There was no statistically significant difference in closure size across the cohorts.

The use of oral anticoagulants in patients undergoing Mohs and cosmetic flap surgeries results in significantly longer intraoperative times. Further investigation of this relationship and consideration of this finding may influence management of dermatologic and cosmetic procedures.

1Loma Linda University Adventist Health Sciences Center, Loma Linda, CA

2Loma Linda University School of Medicine, Loma Linda, CA

New persistent opioid use is recognized as a complication of both major and minor surgeries. Prolonged postoperative pain is often the impetus for patients seeking renewal of opioid prescriptions, which can lead to persistent use and substance abuse. Clinicians have a responsibility to provide adequate pain relief while limiting harmful, unnecessary opioid use. By developing alternative analgesic pathways that are as effective as opioid-inclusive protocols in managing post-operative pain, surgeons can decrease surgical patients' need for and access to opioids in an uncontrolled home environment. Buprenorphine, a partial mu-opioid receptor agonist, has been used for acute and chronic pain management since U.S. FDA approval in 1981. However, few studies have tested its efficacy in perioperative administration.

Patients with localized prostate cancer scheduled for robot-assisted laparoscopic prostatectomy (RALP) were recruited to receive one of two pathways. Forty patients received the standard opioid pathway, and forty-one patients received the buprenorphine-inclusive pathway. In this novel pathway, intravenous buprenorphine was administered intraoperatively and as needed postoperatively. Post-operative analgesic management was otherwise standard, while avoiding non-buprenorphine opiates. Patients were administered a questionnaire at five days post-op regarding their post-operative complications, pain level at discharge, and at-home analgesic use to monitor pain control. Our primary endpoint was adequate pain control, and our secondary endpoints were analgesic consumption at home, opioid-related side effects, and patient satisfaction.

There was no difference between the buprenorphine group and the conventional group in length of stay (1.1 vs 1.3 days, p=0.18), pain control (0–10 scale) at the time of discharge (5.2 vs 5.7, p=0.4), or overall patient satisfaction (p=0.1). Our study demonstrates that a buprenorphine-inclusive pathway maintains pain control, length of stay, and patient satisfaction that are non-inferior to opioid-inclusive analgesia for RALP while decreasing post-surgical and home opioid use.

By markedly decreasing post-surgical opioid prescriptions, we can reduce the risk of opioid addiction and the associated harm to the patient. This study is a proof of principle that buprenorphine use for perioperative analgesia during RALP is an alternative to traditional opioid-inclusive analgesic pathways. We believe such a strategy will decrease the incidence of opioid use disorder and bring benefits including lower associated healthcare spending, improved patient health, and reduced social harm.

1University of California Davis School of Medicine, Sacramento, CA

2University of California Davis Health System, Sacramento, CA

This study compares the outcomes, features, and costs of laminectomy and fusion (LEF) versus laminoplasty (LP) as surgical treatments for patients with cervical spondylotic myelopathy (CSM).

Elective LEF and LP procedures performed at a single institution between 2014 and 2020 were identified. Included patients had no prior cervical spine surgery. All patients received pre- and postoperative evaluations in the outpatient clinic. Only procedures involving three or more spinal levels were included. Clinical data were collected from electronic medical records. SPSS 27 was used for statistical analysis. Hospital costs were obtained from hospital billing for the subgroup of patients for whom this information was available.

135 patients were included: 76 underwent LP and 59 underwent LEF. Mean follow-up time was 14 months. Compared to LEF, LP procedures involved fewer levels (4.2 vs 4.8 levels, p < .001) and shorter operative time per level (47 vs 62 minutes, p < .001). Intraoperative blood loss and fluid replacement were similar between groups (p = .79 and p = .08). Patients in the LP group were discharged an average of 1.1 days earlier (p = .001). LP was not associated with higher rates of C5 palsy (p = .28). Patients who underwent LEF were five times more likely to develop wound infection or dehiscence (risk ratio = 5.2, 95% CI: 1.1 to 23.4). Postoperative ground-level falls requiring an emergency department (ED) visit occurred more frequently in the LEF group (11.9% vs 2.6%, p = .04). The frequency of ED visits for postoperative neck pain did not differ between groups (p = .42). Likewise, rates of new-onset neck pain were similar (p = .45). Both groups reported improved VAS neck pain over the course of follow-up (p = .001). Surgery type, involvement of the C7 level, and the number of levels involved were not predictive of differences in postoperative neck pain (p = .66, p = .31, and p = .87). The LP cohort had greater preoperative cervical lordosis (C2-C7 Cobb angle: 11.69 vs 6.59, p = .01) and lost more lordosis postoperatively (-7.9 vs -1.8, p = .004). LEF cases at this hospital incurred 18% greater fixed costs and 34% greater variable costs (p = .03 and p < .001).

When used to treat patients with multilevel CSM, LP does not seem to be associated with new or worsening axial neck pain compared to LEF. Neck pain may be expected to improve similarly with either surgery. When cervical deformity is not prohibitive, LP could be offered as a less morbid and more cost-efficient alternative to LEF. Modern patient-reported outcomes and randomized controlled trials are still needed to optimize the utility of both procedures.

2University of Rochester, Rochester, NY

3University of Washington School of Medicine, Seattle, WA

4University of Washington Medical Center, Seattle, WA

Diabetes mellitus (DM) imposes a significant burden in the United States and is associated with worse health outcomes. Patients are at risk of peripheral neuropathy, which increases the risk of lower extremity burns, delays burn presentation, and leads to more complications that translate into more amputations. However, there are limited reports regarding the incidence and outcomes of DM foot burns. We aim to better understand DM health outcomes, specifically lower limb amputations, in DM foot burns at national Level 1 and 2 trauma centers.

Implementing a retrospective cohort study design, we reviewed de-identified data on 116,796 adult admissions from 2007–2015 from the National Trauma Databank (NTDB) for patient age, DM, foot burn status, sex, race/ethnicity, region, burn size, and comorbidities. An exploratory logistic regression of factors associated with lower limb amputations was performed.
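As an illustration of the kind of exploratory model described (not the authors' code), a logistic regression producing odds ratios and confidence intervals might be set up as sketched below; the data frame, column names, and values are assumptions generated for the example.

```python
# Illustrative sketch of an exploratory logistic regression of lower limb amputation
# on patient factors; the data and column names are synthetic assumptions,
# not the NTDB extract used in the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "diabetes": rng.integers(0, 2, n),
    "alcohol_use": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n),
    "ckd": rng.integers(0, 2, n),
    "burn_gt_20pct": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "age_gt_40": rng.integers(0, 2, n),
})
# Outcome simulated with an arbitrary relationship so the model has something to fit.
logit_p = -3 + 1.3 * df["diabetes"] + 1.0 * df["alcohol_use"] + df["burn_gt_20pct"]
df["amputation"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "amputation ~ diabetes + alcohol_use + smoking + ckd + burn_gt_20pct + male + age_gt_40",
    data=df,
).fit(disp=0)

# Exponentiate coefficients to obtain odds ratios with 95% confidence intervals.
ci = model.conf_int()
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(ci[0]),
    "CI_upper": np.exp(ci[1]),
})
print(odds_ratios.round(2))
```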

Of the 7,963 (7%) foot burn patients, 1,338 (17%) had DM (median age 56 years [17]) and 378 (28%) were male. Common comorbidities included alcohol use, smoking, and chronic kidney disease (table 1). In the exploratory logistic regression analysis, with all other variables held constant, factors (OR, [CI]) associated with total lower limb amputation were DM (3.70, [2.98, 4.59]), alcohol use (2.78, [2.13, 3.61]), smoking (0.78, [0.62, 1.00]), chronic kidney disease (2.90, [1.72, 4.88]), burn size >20% (4.12, [2.96, 5.73]), African-American/Black race (1.61, [1.29, 2.01]), male sex (1.61, [1.28, 2.02]), and age >40 years. There was a higher rate of lower limb amputations in patients with DM foot burns.

There is a higher rate of total amputations in DM foot burn patients, which indicates the need for increased patient education and treatment protocols that address the incidence of complications in this group. Future steps include confirmatory research to assess the risk of lower limb amputations in DM foot burn patients.

NTDB DM foot burn cohort characteristics (2007–2015)

Rate per 10,000 of total lower limb amputations increases in DM foot burn vs. non-DM foot burn populations from 2007–2015.

1Santa Clara University, Santa Clara, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

The COVID-19 pandemic infected large portions of the US community, including many heart transplant (HTx) patients, but in distinct geographical patterns. HTx programs have reported mortality in the range of 23–29%, compared with 15–17% in non-transplant patients. The impact of COVID-19 infection on hospitalized HTx patients in a large West Coast heart transplant program has not been reported. We now report our outcomes for hospitalized patients with COVID-19.

Between March 2020 and March 2021, we assessed 22 HTx patients who were admitted to Cedars-Sinai Medical Center (CSMC) for COVID-19 infection. COVID-19 is known to affect many organ systems, and we report the effects on the lungs, heart, and kidneys. Morbidity and mortality, including risk of death, were assessed within 90 days post-infection.

Of the 22 HTx patients hospitalized at the CSMC, 7 patients died (31.8%). All patients had COVID pneumonia requiring supplemental oxygen and 5 patients required ventilatory support. The mean peak FiO2 of the patients was 79.7%. 16 of these patients also were noted to have an increase in serum creatinine, with 6 patients requiring kidney dialysis. Cardiac function was maintained in all patients with COVID-19 and no myocarditis or cardiac dysfunction was observed. 9 patients received remdesivir and 19 patients received corticosteroids. 4 patients received tocilizumab anti-inflammatory therapy.

COVID-19 resulted in significant morbidity and mortality in hospitalized HTx patients. The immunosuppressed state appears to be a risk factor for poor outcomes, with mortality higher than that reported in non-transplant hospitalized patients.

1University of California Los Angeles, Los Angeles, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

Primary graft dysfunction (PGD) is seen in approximately 7–29% of heart transplant (HTx) patients. Many of these patients with PGD also develop significant vasoplegia, which requires high doses of intravenous vasoconstrictors. Outcomes of patients with severe PGD are compromised within 30 days after HTx. Risk factors for the development of severe PGD have included angiotensin-converting enzyme inhibitors (ACEi). There may be a connection between ACEi and the kallikrein-kinin system whereby bradykinin is increased, resulting in more vasoplegia and PGD. It is not known whether the newer drug sacubitril/valsartan (S/V) is also a risk factor for the development of vasoplegia/severe PGD, as bradykinin is also increased with sacubitril. Therefore, we reviewed our large HTx program to determine whether S/V is a risk factor for this complication.

Between 2015 and 2020, we assessed 65 HTx patients who were on S/V at the time of transplantation. Vasoplegia was defined as requiring more than 2 vasoconstricting drugs with a systolic BP <90 mmHg, and PGD was defined per the ISHLT classification scheme (within 24 hours post-transplant). Patients on S/V were compared to patients on ACEi/ARB (a 1:1 control group matched for age, sex, and transplant year). Outcomes included death, cardiac dysfunction, and non-fatal major adverse cardiac events (NF-MACE: MI, new CHF, PCI, ICD/pacemaker, or stroke) in the first year after HTx.

Compared to ACEi/ARB, S/V had similar risk for the development of vasoplegia or severe PGD. Furthermore, 1-year survival, and 1-year freedom from cardiac dysfunction and NF-MACE were not significantly different between groups.

Comparison of Entresto vs. ACEi/ARB

Patients undergoing HTx on S/V do not appear to be at risk for vasoplegia or severe PGD.

1University of California Los Angeles David Geffen School of Medicine, Los Angeles, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

The calcineurin inhibitors (CNIs), including tacrolimus and cyclosporine, have revolutionized heart transplantation (HTx) by maintaining low rejection rates. However, CNIs have significant side effects, such as nephropathy, hypertension, malignancy, and hypomagnesemia. Whether this hypomagnesemia has an impact on outcomes after HTx has not been addressed. Hypomagnesemia has been implicated in muscle cramping and cardiac arrhythmias. Therefore, we reviewed our HTx patients' magnesium (Mg) levels to assess outcomes in the first 6 months after HTx.

Between 2010 and 2020, we assessed 956 HTx patients and recorded their Mg levels in the first 6 months after HTx. Patients with low Mg levels less than or equal to 1.8 mg/dL were assessed for complications including muscle cramping, cardiac arrhythmias, rehospitalization, rejection episodes, and death. Patients with low Mg levels were grouped into mildly low Mg levels (1.7–1.8 mg/dL) and moderately low Mg levels (1.4–1.7 mg/dL). Patients were compared to control patients who had normal Mg levels (>1.8 mg/dL) during this period of time.

Patients with mildly or moderately low Mg levels compared to patients with normal Mg levels had no difference in muscle cramping, rejection episodes, cardiac arrhythmias, and use of antihypertensive medications. Kidney function was abnormal in those patients with normal magnesium levels.

Mild-moderate hypomagnesemia did not have significant adverse effects in heart transplant patients in terms of muscle cramping, cardiac arrhythmias, cardiac rejection, or cardiac function.

1University of California Santa Barbara, Santa Barbara, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

An acute abdomen following heart transplantation is not uncommon. Patients who have atherosclerotic vascular disease (coronary artery disease) as the indication for heart transplant may also be at risk for ischemic bowel associated with their surgeries. In addition, patients who have had gallstones are at increased risk for cholecystitis immediately following cardiac surgery. The frequency of abdominal complications warranting urgent abdominal surgery has not been well established. Furthermore, the increased inflammation from abdominal surgery may trigger an immune response and thereby cause a rejection episode. We sought to evaluate these complications in our large, single-center experience.

Between 2010 and 2021, we assessed 1154 patients who underwent heart transplantation and reviewed the frequency of acute abdomen requiring surgical intervention (11 patients) in the first month following heart transplant surgery. These patients were assessed for 30-day and 1-year survival, episodes of cardiac rejection, and infectious complications requiring intravenous antibiotics and/or readmission in the ensuing 3 months. The acute abdomen patients were compared to a case-control group matched for age, sex, and time of transplant.

1.0% (11/1154) of our patient population developed an acute abdomen requiring surgical intervention within 30 days following heart transplant surgery. Surgical interventions included hemicolectomy, cholecystectomy, and exploratory laparoscopy. Compared to the control group, the acute abdomen group had significantly worse 30-day and 1-year survival. In the study group, an additional 36.4% of patients developed infectious complications requiring rehospitalization and intravenous antibiotics. Rejection episodes following these events were not different from the control population.

Acute abdomen immediately post-heart transplant resulting in urgent abdominal surgery requiring hemicolectomy and/or cholecystectomy has significant morbidity/mortality. For patients awaiting heart transplant with gallstones, prophylactic laparoscopic cholecystectomy might be considered.

1University of Trieste, Trieste, Italy

2University of Colorado Denver School of Medicine, Aurora, CO

Cardiomyopathies (CMP) are a heterogeneous group of heart diseases characterized by structural and electrical abnormalities lacking a secondary causative etiology and frequently related to mutations in CMP genes. Recent studies in this field have shown important phenotype overlaps between dilated cardiomyopathy (DCM) and arrhythmogenic cardiomyopathy (ACM), making the diagnosis a challenging task. The aim of this study is to assess whether a classification of (non-hypertrophic) CMP patients based on genetic characterization outperforms the classical, phenotype-driven diagnostic approach in diagnostic and prognostic accuracy.

We analyzed a population of patients affected by genetically determined DCM and ACM, including carriers of 'pathogenic' or 'likely pathogenic' (P/LP) variants, registered at the heart disease centers of the Trieste and Denver hospitals. We described the phenotype distribution in our population with a clinical and echocardiographic evaluation stratified by the disease-related mutated genes. We then examined the prognostic impact of individual genes/genetic clusters on the following outcomes: 1) all-cause mortality and heart transplant; 2) heart failure-related death, heart transplant, or destination left ventricular assist device implantation (DHF/HTx/VAD); and 3) sudden cardiac death, sustained ventricular tachycardia/ventricular fibrillation, or appropriate defibrillator shock (SCD/VT/VF/shock).

281 patients carrying P/LP variants (82% DCM) were included in the study. Titin (TTN) and sarcomeric gene (SARC) variants were the most prevalent (TTN: 95 patients, 34% of the total population; SARC: 63 patients, 22% of the total population) and were almost entirely associated with a DCM phenotype (TTN: 100% DCM; SARC: 95% DCM); lamin (LMNA) variants were found in 29 patients (10% of the total population, 96% DCM). A more heterogeneous phenotypic distribution between DCM and ACM was noted for desmoplakin (DSP), plakophilin-2 (PKP2), and filamin C (FLNC) variants. Patients with an uncategorized DCM phenotype and carriers of DSP, PKP2, FLNC, and LMNA variants (arrhythmic genes) experienced more frequent SCD/VT/VF/shock events during follow-up (median = 132 months) compared to patients with a DCM phenotype (p = 0.002 and p = 0.023). The analysis showed that only P/LP variants of arrhythmic genes, early age of onset, and male gender were associated with an increased risk of SCD/VT/VF/shock events during follow-up. Additionally, HF events did not differ significantly by genotype.

In a large DCM and ACM population with a positive genetic test for P/LP variants, classification based on specific genotypes is a useful tool in arrhythmic prognostication. These findings support the need for extensive genetic testing to support CMP diagnosis and prognosis.

1Western University of Health Sciences College of Osteopathic Medicine of the Pacific, Pomona, CA

2Cedars-Sinai Smidt Heart Institute, Los Angeles, CA

For patients awaiting heart transplantation (HTx) who have high levels of circulating antibodies (greater than 70%), desensitization therapy may be indicated to expand the pool of compatible donors. As women appear to be more highly sensitized (due to multiple pregnancies), it is not clear whether women benefit from desensitization therapy. We sought to answer this question through a review of our large, single-center database.

Between 2008 and 2020, we assessed 49 patients awaiting HTx who underwent desensitization therapy. These patients were divided into groups by sex for their response to desensitization therapy. Our desensitization protocols consist of regimens including intravenous immune globulin, anti-CD20 monoclonal antibody, plasmapheresis, and/or proteasome inhibitors. Response to desensitization therapy was assessed by the decline of the dominant circulating antibody as determined by mean fluorescence intensity (MFI). Post-HTx data were assessed for 1-year survival and freedom from rejection (acute cellular rejection [ACR], antibody-mediated rejection [AMR]). Rejection episodes were compared to a control group of non-sensitized patients transplanted during the same period (n=771).

The response to desensitization therapy in women appeared comparable to that in men treated with similar protocols. There were no significant differences in waitlist mortality, time on the waitlist, 1-year post-transplant survival, or 1-year freedom from ACR or AMR between the two groups. Compared to non-sensitized patients, freedom from AMR was significantly lower in both sensitized men and women (72.7% men vs. 78.9% women vs. 96.5% control group, p≤0.001).

Sensitized women awaiting HTx compared to men appear to have similar response to various desensitization regimens. Post-HTx, there was more AMR in both groups, suggesting memory B-cells may be responsible.

Kern Medical Center, Bakersfield, CA

Reports of cardiovascular manifestations in the setting of COVID-19 have included arrhythmia, pericarditis, heart failure, acute coronary syndrome, and myocarditis. Myocarditis is defined as inflammation of the heart muscle and is commonly associated with viral infection. Common symptoms of myocarditis include chest pain, shortness of breath, arrhythmia, and fatigue. While endomyocardial biopsy remains the gold standard for diagnosis, clinically suspected myocarditis in low-risk patients can be established through presentation and non-invasive diagnostic findings. In this case report, we aim to highlight the association between coronavirus disease 2019 (COVID-19) infection and cardiovascular complications such as myocarditis. In our case, heart catheterization demonstrated 60% stenosis of the proximal left anterior descending artery; however, this lesion was not suspected to be the culprit lesion causing myocardial injury. The etiology of injury was thought to be global ischemia in the setting of post-COVID-19 infection.

Retrospective chart review after IRB approval.

This is a 39-year-old Hispanic female with a history of PCOS, hyperlipidemia, hypertension, oral contraceptive pill use, and provoked deep vein thrombosis and pulmonary embolism on rivaroxaban, who presented to the emergency department with 4 days of new-onset intermittent severe substernal chest pain radiating down her left arm. She had SARS-CoV-2 pneumonia the month prior to this presentation, notable for cough, anosmia, and myalgias, which resolved without hospitalization. On arrival, she was hypertensive, tachycardic, and afebrile. The coagulation panel was normal; troponin-I was elevated at 6.25 with a peak of 9.27. Toxicology was negative for stimulants. She tested positive for SARS-CoV-2 but remained asymptomatic. The patient was started on dual antiplatelet therapy and anticoagulation therapy. Repeat ECG showed no new changes. A second episode of chest pain revealed lateral ST elevations and Q waves in the inferior leads. Troponin continued to downtrend. Left heart catheterization was performed with an incidental finding of 60% stenosis of the proximal LAD with a smooth plaque. The patient clinically improved without further chest pain and was discharged on dual antiplatelet therapy.

Evaluation and tracking of clinically suspected myocarditis in the setting of COVID-19 infection may give insight into the pathophysiology of infection in cardiomyocytes due to SARS-CoV-2. This case report aims to illustrate the possible association between COVID-19 and myocarditis in the hopes of decreasing morbidity and mortality.

2Ross University School of Medicine, Miramar, FL

Hypertrophic cardiomyopathy (HCM) is known to have a wide spectrum of patterns. This case highlights an uncommon form of HCM called apical hypertrophic cardiomyopathy (ApHCM) which was seen to mimic myocardial infarction.

A 46-year-old Punjabi male with hypertension presented to an outside hospital with chest pain and was found to have elevated troponin levels of 0.31 ng/mL. A nuclear Lexiscan stress test at that time showed a 'reversible defect of the cardiac apex suggestive of ischemia', cardiac catheterization was negative, and transthoracic echocardiogram (TTE) showed preserved left ventricular function and mild mitral regurgitation. Troponin trended down to 0.23 ng/mL and the patient was discharged.

The patient then presented to the medicine clinic to establish care, complaining of intermittent palpitations lasting about 2–3 minutes per episode. He reported that the episodes were initiated by physical activity, such as walking about 100 feet, and alleviated with rest. He denied any chest pain or shortness of breath. He had a positive history of heavy alcohol use, drinking 6–8 alcoholic beverages 2–3 times a week. An electrocardiogram (ECG) done in the clinic showed left ventricular hypertrophy and abnormal T waves in the inferior leads. Repeat TTE showed an estimated left ventricular ejection fraction of >65% with unusual thickening of the apical to mid left ventricle, consistent with ApHCM. The patient was then referred to the cardiology clinic for further management and will be treated with an appropriate beta-blocker and cardiac monitoring for further risk stratification.

Hypertrophic cardiomyopathy spans a wide spectrum, with the most common form being asymmetric septal hypertrophy (ASH). A rarer form, ApHCM, is more prevalent in the Asian population (25%) than in non-Asians (1% to 10%). Compared to ASH, it is more often sporadic and is associated with more atrial fibrillation (AF) and different risk factors for sudden cardiac death (SCD). There are no current guideline recommendations for diagnosis, screening, or patient risk stratification available for ApHCM.

This case illustrates the importance of understanding and diagnosing ApHCM, since the patient's symptoms mimicked a myocardial infarction. Accurate and timely diagnosis may greatly improve the clinical outcome and overall well-being of the patient.

1INSIGHTS Consortium, Overland Park, KS

2Children’s Hospital of Colorado, Aurora, CO

3University of Colorado – Anschutz Medical Campus, Aurora, CO

4The Children’s Hospital of Philadelphia Division of Endocrinology and Diabetes, Philadelphia, PA

5Ann Robert H Lurie Children’s Hospital of Chicago, Chicago, IL

6National Institute of Child Health and Human Development, Bethesda, MD

7Turner Syndrome Global Alliance, Overland Park, KS

8Children’s National Hospital, Washington, DC

9University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, NC

10The University of Texas Health Science Center at Houston, Houston, TX

Turner syndrome (TS) occurs in ~1 in 2,000 females, who are born with partial or complete absence of the second sex chromosome. As with many rare conditions, most research in TS has focused on specific features (particularly growth), been limited to single centers, included minimal diversity, and lacked community engagement. The Inspiring New Science in Guiding Healthcare in Turner Syndrome (InsighTS) Registry was developed to address these limitations.

A Steering Committee of stakeholders comprising researchers, multidisciplinary clinicians, and patient advocates was formed to develop the goals, infrastructure, data collection tools, protocols, and engagement strategies for a national, collaborative, clinic-based longitudinal registry of individuals with TS. Six institutions with multidisciplinary TS clinics across geographical regions were onboarded as recruitment sites, with the goal of enrolling >80% of eligible patients and achieving diversity in age, race, ethnicity, payor status, and timing of diagnosis. The team identified patient-centered multidisciplinary outcomes obtainable through medical records and optional additional study procedures.

To date, 154 participants representing all regional centers have enrolled in InsighTS, with an average enrollment rate of 15 per month. The average age at enrollment is 11.9 ± 11 years (range 0–67; 16.9% were ≥18 at enrollment). 18.5% identify as Hispanic/Latinx, and the racial distribution includes 6.2% Asian, 13.7% Black, 71.9% White, and 11.0% Other Race. TS was identified prenatally in 30.3% of participants. The majority of participants agreed to be contacted for future studies (89%), to complete annual surveys (83%), and to contribute to the biobank (61%).

Stakeholder engagement for the development of a national clinic-based registry for the rare genetic condition of TS has successfully led to a diverse cohort representative of the US population. Additional engagement strategies to increase enrollment while prioritizing diversity are underway.

1Children’s Hospital of Los Angeles, Los Angeles, CA

2UC Irvine School of Medicine, Orange, CA

3Children’s Hospital of Orange County, Orange, CA

The COVID-19 pandemic has disproportionately impacted children from low socioeconomic and minority groups. Parents faced new decisions regarding vaccinating their child against COVID-19 and their child's return to school in the fall of 2021. Prior studies show that COVID-19 vaccine hesitancy is associated with income, race, and marital status. However, few studies examine the demographics behind COVID-19 vaccine hesitancy in relation to return to school in vulnerable communities. Understanding both is crucial to addressing challenges for children facing healthcare inequities.

A cross-sectional survey was conducted in inpatient and outpatient settings at an academic center and its affiliated site between September 2020 and September 2021. Parents were recruited to complete an anonymous mobile phone-based survey using REDCap regarding perspectives on COVID-19 vaccines and factors affecting children's return to school during the pandemic. Statistical analyses were performed to examine the association between demographic factors (gender, marital status, education, ethnicity, and household income), COVID-19 vaccine hesitancy, and healthcare inequities affecting return to school.

Of 189 respondents, 65.5% were married, 41.9% had less than a college education, and 37.0% had households of more than 2 people. 64.6% were minorities and 53.9% were from low-income families. COVID-19 vaccine acceptance was positively associated with marital status and number of household members: 60.9% of married individuals reported they would vaccinate their child compared to 30.4% of unmarried individuals (p = 0.001), and 62.1% of households of more than 2 people would vaccinate compared to 43.1% of households of 2 or fewer (p = 0.015, table 1). Those who accepted or rejected COVID-19 vaccines were more likely to prefer onsite school compared to those who were unsure (p = 0.020). Education, ethnicity, and income were not associated with COVID-19 vaccine acceptance (table 1) or with parental decisions about having their child return to school. Respondents with less than a college education, from low-income families, and from minority groups favored returning to school because of school-provided lunches and internet availability.

COVID-19 vaccine hesitancy and demographic factors

Our study shows that parents of all ethnicities and income levels may experience hesitancy toward COVID-19 vaccines. COVID-19 vaccine acceptance is positively associated with marital status and number of people in the household. Parents from vulnerable communities experience barriers influencing their decision to send children back to school. Larger studies are needed to examine the underlying demographic factors behind COVID-19 vaccine hesitancy and return to school. Targeted interventions are needed for children experiencing healthcare inequities in order to increase COVID-19 vaccine confidence and promote a safe return to school.

1Nova Southeastern University, Fort Lauderdale, FL

2YMCA of South Florida, Fort Lauderdale, FL

Black individuals in the United States have historically faced barriers to accessing healthcare, barriers that have only been exacerbated by the COVID-19 pandemic. The purpose of this study was to examine the self-reported likelihood of utilizing care within this population in the midst of the COVID-19 pandemic.

Housing authority residents in Broward County, Florida were asked about their likelihood of visiting their doctor during the COVID-19 pandemic as part of a COVID-19 testing and education initiative conducted by the YMCA of South Florida, in partnership with the Housing Authority of the City of Fort Lauderdale and the Broward County Housing Authority. Secondary data analysis of program data was conducted, including descriptive statistics for describing respondents, Chi-square and t-tests for detecting significant differences around likelihood of seeking care between groups, and logistic regression to determine the odds of particular groups’ likelihood of seeking care.

Significant differences by race/ethnicity were found among respondents (n=147) reporting they were more likely to visit their doctors (X2 [1, n=147] = 8.15, p < .01). Black respondents had nearly three times the odds of reporting being more likely to visit their doctor (aOR 2.76, 95% CI 1.36–5.60) than other groups. However, Black respondents reported being significantly more afraid of contracting the virus that causes COVID-19 on the way to the doctor's office than non-Black respondents (X2 [1, n=147] = 4.23, p < .05). Black respondents also reported being more concerned about contracting the virus that causes COVID-19 at the doctor's office than non-Black respondents (X2 [1, n=147] = 5.29, p < .05).

Black Housing Authority residents reported a high likelihood of utilizing care if needed. However, this willingness is coupled with a fear of contracting the virus that causes COVID-19 in the process of utilizing care. Areas for further research include determining the rationale behind this positive attitude toward healthcare utilization and identifying specific concerns about contracting COVID-19 in the process of utilizing care (e.g., fear of contracting the virus through the use of public transportation on the way to the clinic).

1Eastern Virginia Medical School, Norfolk, VA

2University of Southern California Norris Comprehensive Cancer Center, Los Angeles, CA

3NCI Center to Reduce Cancer Health Disparities, Bethesda, MD

4University of Southern California Keck School of Medicine, Los Angeles, CA

5NCI Center to Reduce Cancer Health Disparities, Bethesda, MD

Hispanic populations experience disparities with regard to human papillomavirus (HPV) vaccine uptake despite ranking highest among racial/ethnic groups for rates of cervical cancer. It is well established that HPV vaccination confers a high degree of protection against HPV-related cancers. Yet barriers to HPV vaccination contribute to low rates of vaccine initiation and series completion in Hispanic populations, with only 35–46% of adolescents fully vaccinated against HPV. Notably, the literature suggests low health literacy is a common deterrent to vaccine uptake. The purpose of this study is to assess the utility of educational workshops in improving HPV literacy in Hispanic populations in the Los Angeles area.

Educational interventions, consisting of a one-hour slideshow presentation delivered via Zoom video platform, were conducted from August 2020 to April 2021 and addressed the clinical significance of HPV and the HPV vaccine. Study participants (n=92) were recruited through community partners and by word-of-mouth in a snowball fashion. A pre- and post-test design was used to investigate study participants’ knowledge of HPV and the HPV vaccine before and after educational intervention. Lecture presentations and pre-/post-questionnaires were offered in English and Spanish, depending on the participant’s language preference. A paired t-test was used to compare HPV and HPV vaccine literacy scores pre- and post-lecture. Analysis was completed using SPSS Statistics Version 28.0.
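For illustration only, a paired (dependent-samples) t-test comparing pre- and post-lecture literacy scores could be run as sketched below; the arrays are hypothetical stand-ins, not the 58 matched questionnaires analyzed in the study.

```python
# Minimal sketch of a paired t-test on pre- vs post-lecture literacy scores;
# the arrays below are hypothetical stand-ins, not study data.
import numpy as np
from scipy.stats import ttest_rel

pre_scores = np.array([0.65, 0.70, 0.72, 0.68, 0.75])   # proportion correct before lecture
post_scores = np.array([0.80, 0.82, 0.85, 0.78, 0.88])  # proportion correct after lecture

t_stat, p_value = ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```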

Of the 92 study participants who attended the lectures, 58 (63.0%) completed both the pre- and post-questionnaires. Prior to educational intervention, HPV and HPV vaccine literacy scores for study participants were 70.4% and 70.7% accuracy, respectively. Post-lecture, HPV and HPV vaccine literacy scores increased to 83.1% and 77.8% accuracy, respectively, demonstrating a statistically significant improvement (p<0.001) in HPV literacy with educational intervention.

The implementation of educational lectures was effective in improving HPV and HPV vaccine literacy in a Hispanic population. These findings demonstrate the need for increased emphasis on HPV education to close the knowledge gap disproportionately affecting Hispanic populations. Future research should focus on vaccine acceptability following educational intervention in Hispanic populations to identify strategies to improve HPV vaccination rates. The diminished response rate may be attributable to hesitancy among undocumented study participants to reveal personally identifiable information.

1Loma Linda University School of Medicine, Loma Linda, CA

2Loma Linda University, Loma Linda, CA

It is evident in the academic literature that representation of underrepresented in medicine (URiM) minorities in Emergency Medicine (EM) is sparse. This disparity is even more pronounced in EM leadership. Faculty and residents are directly involved in recruiting, interviewing, and ranking potential incoming residents. The lack of URiM participation in that process impacts the potential for future URiM physicians to be appointed to EM residency leadership positions. Our study sought to identify potential areas for increased representation in the future and factors that may increase URiM involvement.

We administered a survey to the U.S. Emergency Medicine Residency Program Directors (PDs) listed on FREIDA, the American Medical Association (AMA) residency and fellowship database. We drafted and piloted the online survey instrument before sending it to participants via Qualtrics. Survey items focused on ethnic identity in program leadership, career preparation such as mentors and previously held roles, and strategies used to encourage URiM recruitment. Participants received one announcement email and three reminder emails following the survey distribution. We used Microsoft Excel for primary data analysis.

We received 57 completed surveys. 22% of respondents identified as URiM, including 9% who identified as Black and 7% as Latinx. The median percentage of residents identifying as URiM was 13% (IQR 1%–32%). Eight programs (14%) reported having at least one Chief Resident identifying as URiM. 72% of respondents reported that a mentor was instrumental in their ascension to PD; 11% reported that their mentor identified as URiM. We asked PDs to indicate which strategies they have implemented to encourage URiM participation (Boatright 2008). The most commonly implemented strategies were 'Know the institution's local and community demographics, and address those needs' (51%), followed by 'Broaden selection criteria beyond USMLE scores to include intangibles such as leadership, community service, and other life experiences' (49%), and 'Develop curricula to address topics on diversity, cultural competence, and implicit bias' (47%).

The disparity of URiM PDs in EM may be a result of a lack of URiM mentorship. 29% of respondents were URiM but only 11% reported having a URiM mentor. This lack of mentor-mentee concordance may be an area of further study and improvement. More intentional utilization of URiM recruitment strategies could also drastically improve representation. Increased URiM participation in EM leadership has great potential to improve diversity, equity, and inclusion in EM overall.

*Boatright N, et al. The Impact of the 2008 Council of Emergency Residency Directors (CORD) Panel on Emergency Medicine Resident Diversity. DOI: 10.1016/j.jemermed.2016.06.003.

1University of California San Diego School of Medicine, La Jolla, CA

2Rady Children’s Hospital San Diego, San Diego, CA

Medical education health equity curricula rarely emphasize advocacy and community engagement, further compounding the minority tax borne in pursuing health equity work. Health equity curricula must include three components: history, outcomes, and interventions. The Journal Club and Advocacy Lab (JC-AL) schema was added to the Health Equity Thread (HET) preclinical curriculum at UC San Diego (UCSD) School of Medicine to teach and support interventions addressing health disparities.

Preclinical students receive HET credit by attending JC-ALs. The JC-AL workflow is depicted in figure 1; the JC and AL are held 1–2 weeks apart. Participants took a survey, approved by the UCSD Institutional Review Board, before the JC and after the AL. Survey responses from November 2020 to June 2021 were gathered and summarized for each timepoint using R.

Of participants surveyed, 141 (28.5%) identified as underrepresented in medicine. About a quarter of participants reported an increase in mood (25%), resilience (27%), sense of community (24%), and/or motivation (29%) regarding health equity work after the intervention. 158 participants (67.2%) reported being somewhat or very likely to stay involved in the advocacy project, and 93 participants (39.6%) reported being likely to lead a session in the future. Almost all of the JC-ALs have produced long-term projects, including:

- Educational material for healthcare providers and preclinical students regarding the removal of race from eGFR calculations and the adoption of cystatin C in the UCSD Health Laboratory Medicine formula

- Learning modules for preclinical students on gender affirming and trauma informed care

- Elective on obtaining a health equity history in the emergency department

The JC-AL schema is a feasible approach for engaging trainees with the community and the institution to enact change. It is a well-received component of the HET, drawing 30–100 participants at each event.

University of Colorado, Denver, CO

There is very little published data exploring the impact that racial or sexual minority identity has on a resident's training experience. Given that a high percentage of internal medicine training programs are predominantly white, it is important to understand the emotional and support-related barriers minority residents face. We began examining these barriers with a survey-based needs assessment.

174 residents enrolled in the University of Colorado Internal Medicine Residency Program were asked to participate in an online survey. The survey consisted of several validated instruments, including the PHQ-4, the MOS Social Support Survey, and the UCLA Loneliness Scale. The survey included a demographics section, and each respondent used a unique PID to maintain anonymity. 65 of the 174 residents responded. Answers were coded and scored per the original publications. Data were analyzed using two-tailed t-tests in SAS.
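As context for the group comparisons reported below, a two-tailed independent-samples t-test can be sketched as follows. The group labels and scores are hypothetical, and Welch's variant is used here only as a reasonable default; the abstract does not state which variant was run in SAS.

```python
# Illustrative two-tailed independent-samples t-test comparing MOS total support
# scores between two resident groups; the values below are hypothetical.
import numpy as np
from scipy.stats import ttest_ind

group_a = np.array([60, 58, 72, 65, 70, 68, 55, 75, 62])      # e.g., LGBTQ+ respondents
group_b = np.array([80, 78, 85, 76, 82, 79, 81, 77, 84, 75])  # e.g., non-LGBTQ+ respondents

# Welch's variant (equal_var=False) does not assume equal variances across groups.
t_stat, p_value = ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```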

The average MOS total support score was significantly lower in LGBTQ+ residents compared to Non-LGBTQ+ residents (Mean 65.80 vs. 79.16; P = 0.035). LGBTQ+ residents also trended towards having higher amounts of burnout, though this wasn’t statistically significant (Mean 0.38 vs 0.17; P = 0.19). Notably 3 out of 9 LGBTQ+ residents reported feeling burnt out compared to 9 out of 56 Non-LGBTQ+ residents (33% vs 16%). Notably most of the significant findings were amongst single vs non-single residents with significance in: UCLA loneliness scale (P = 0.03), MOS total support score (P <0.0001), MOS emotional support score (P = 0.008), MOS affectionate support score (P <0.0001), MOS tangible support score (P <0.0001), MOS positive interactions score (P = 0.001) and PHQ-Depression sub-domain (P = 0.025). Racial minority residents had lower average levels of burnout compared to non-minority residents (Mean 0.12 vs 0.23; P = 0.32). However, racial minority residents had lower average levels of overall social support compared to non-minority residents (Mean 73.16 vs 78.74; P = 0.270) with the tangible support subdomain score being the closest for significance (Mean 13.55 vs 15.93; P = 0.120).

The sample size for the survey-based study was smaller than anticipated. However, it was large enough to detect significantly lower social support among LGBTQ+ residents and to suggest higher levels of burnout in this group. It is also important to note that, while the difference did not reach significance, racial minority residents experienced lower average levels of overall social support. Surprisingly, minority residents reported less burnout, which may indicate increased resilience or use of protective mechanisms. Further research needs to be conducted to better understand the needs of LGBTQ+ and racial minority residents. Future directions include nationwide expansion to gather a larger sample size and assess for geographic differences.

1Loma Linda University School of Medicine, Loma Linda, CA

2Kaiser Permanente Baldwin Hills, Los Angeles, CA

3University of Southern California, Los Angeles, CA

4University of California Irvine, Irvine, CA

5University of California San Diego, La Jolla, CA

6Loma Linda University School of Medicine, Loma Linda, CA

To analyze the presentation, disease course, and treatment of idiopathic subglottic stenosis (iSGS) in non-Caucasian women.

In this multi-institutional retrospective study, the information extracted included date of birth, age at symptom onset, age and date of diagnosis, race, Cotton-Myer (CM) grade, stenosis length and distance from the glottis, BMI, comorbidities, medications used to manage iSGS, age at first surgery, additional treatment with serial intralesional steroid injections (SILSI), the date of each surgery, occupation, autoimmune labs, and family history of autoimmune disease.

35 non-Caucasian women with idiopathic subglottic stenosis were identified. Of the 35 women, 31 were Hispanic, one was African-American, two were Asian, and one was non-Hispanic mixed race. Their average BMI was 31.8 ± 2.19 kg/m2 and 51.4% of the patients were obese (BMI >30). 31.4% had hypertension. The average age of onset was 45.8 years (95% CI, 42.2–49.3), with a range of 26–69 years. The average age at diagnosis was 47.8 years (95% CI, 44.3–51.3), with a Charlson comorbidity index of 0.85 (95% CI, 0.42–1.28). At diagnosis, 13.4% were CM I, 43.3% were CM II, and 43.3% were CM III (n=30). The average age at first surgery was 46.8 years (95% CI, 43.2–50.4), and 17 patients received SILSI. While treatment type varied given the retrospective design, none of the 35 women received open reconstruction. 62.9% experienced disease recurrence after their first surgery, with a median of 11 months between the first and second surgeries, and patients received an average of 2.5 surgeries.

Our results show that this non-Caucasian population does not differ from the majority-Caucasian populations described in the current literature on idiopathic subglottic stenosis. This calls into question the homogeneity of the disease and highlights the need to adjust recruitment methods to include more people of color and provide a more accurate representation of the patient population.

1University of Washington School of Medicine, Seattle, WA

2University of Washington, Seattle, WA

Hidradenitis suppurativa (HS) is an autoinflammatory disease characterized by painful boils beneath the skin. While socioeconomic factors have been linked to HS individually, there has been no scoping review that synthesizes these correlations. Our objective was to assess the published data on the associations between HS and the factors of income, education, and work.

A search limited to English publications was conducted in PubMed, Embase, Web of Science and Cochrane from database origin to 07/26/21. The terms used were ‘hidradenitis’ combined with ‘socioeconomic,’ ‘insurance,’ ‘class,’ ‘disparities,’ ‘disparity,’ ‘education,’ ‘income,’ ‘work,’ ‘employment,’ ‘job,’ ‘insurer,’ ‘medicaid,’ or ‘professional activity.’ Eligible publications were peer-reviewed and examined the association between HS and income level, educational attainment, occupation class, employment status, work impairment, or insurance status. Records were evaluated by O.C. and K.D. In the event of a disagreement, another reviewer was available to resolve the discrepancy.

After duplicate removal, 413 records were screened by title/abstract. 79 full-text records were then assessed for eligibility and 33 articles met inclusion criteria. By manually searching article references, an additional 3 papers were included. 29 research articles, 6 reviews, and 1 case report from 13 different countries were qualitatively synthesized according to the defined categories of associations.

3 articles found that HS patients had lower income levels but one of those studies, after adjusting for age/sex, found that this was not significant. 6 articles elucidated an association between HS and lower educational attainment. An association between HS and lower class of occupation was found by 1 study, and 7 publications (6 articles, 1 review) demonstrated a higher probability of being unemployed as an HS patient. 16 articles, 1 case report, and 5 reviews discussed the association between HS and work impairment. A higher likelihood for HS patients to have government-funded insurance was found by 3 studies. 4 articles utilized a combination of the factors as measures of SES. 3 of them found associations between low SES and HS, while one Israeli study found the opposite.

Our qualitative synthesis demonstrates that HS globally is linked with lower income levels, reduced educational attainment, unemployment, work impairment, and government-funded insurance coverage. Though one study found that higher SES is associated with HS, this can be explained by their usage of dermatologist-diagnosed HS patients and the fact that in Israel, dermatology encounters require co-payments unlike primary care visits. Though the directionality between HS and lower SES cannot be determined from the current research, our work shows the importance of considering SES when treating HS patients.

1Western University of Health Sciences, Pomona, CA

2Loma Linda University, Loma Linda, CA

While 90% of former American Osteopathic Association (AOA) residency programs transitioned to Accreditation Council for Graduate Medical Education (ACGME) accreditation, surgical subspecialty programs such as otolaryngology (ENT) (62%) and ophthalmology (47%) struggled to gain accreditation. DOs have actively participated in serving underserved communities, and losing AOA surgical specialty programs may decrease access to surgical care in rural and non-metropolitan areas.

A directory of former AOA ENT and ophthalmology programs was obtained from the American Osteopathic Colleges of Ophthalmology and Otolaryngology-Head and Neck Surgery (AOCOO-HNS). A secured survey was sent out to 16 eligible ENT and ophthalmology program directors. The survey contained both quantitative and qualitative aspects to help assess why these programs did not pursue or failed to receive ACGME accreditation.

12 of 16 eligible programs responded: 6 ophthalmology and 6 ENT program directors. 83% of respondents did not pursue accreditation (6 ophthalmology and 4 ENT programs), and 17% (2) were unsuccessful in achieving accreditation despite pursuing accreditation. Across 12 respondents, 58% (7) cited lack of hospital/administrative support and 42% (5) cited excessive costs and lack of faculty support as reasons for not pursuing or obtaining ACGME accreditation.

Percentage of remaining osteopathic ENT and ophthalmology programs from 2014–2015 to 2020–2021 academic year under the SAS

The survey results reflect financial issues associated with rural hospitals. Lack of hospital/administrative support and excessive costs to transition to the ACGME were key drivers in the closures of AOA surgical specialty programs. Considering these results, we have 4 recommendations for various stakeholders, including program directors, designated institutional officials, hospital chief medical officers, and health policy experts. These recommendations include expanding Teaching Health Center Graduate Medical Education to surgical subspecialties, identifying and learning from surgical fields such as urology that fared well during the transition to ACGME, addressing the lack of institutional commitment and prohibitive costs of maintaining ACGME accredited subspecialty programs in under-resourced settings, and reconsidering Centers for Medicare & Medicaid Services (CMS) pool approach to physician reimbursement.

University of California Irvine, Irvine, CA

Renal cell carcinoma (RCC) is the most common type of kidney cancer worldwide. Angiogenesis plays a major role in providing adequate blood flow and nutrients to promote tumor growth and RCC progression. While radiologists assess enhancement patterns of renal tumors to predict tumor pathology, to our knowledge, no formal scoring system has been created and validated to assess the level of neovascularity in RCC, despite its critical role in cancer metastases. In this study, we characterized and analyzed the level of angiogenesis in tumor-burdened kidneys and their benign counterparts. We then created and validated a scoring scale for neovascularity that can help predict tumor staging for RCC.

After Institutional Review Board approval, the charts of patients who had undergone surgery for RCC between January 13, 2014 and February 4, 2020 were retrospectively reviewed for inclusion in this study. Inclusion criteria were a diagnosis of RCC, simple/radical nephrectomy, pre-operative contrast enhanced computed tomography (CT) scans, and complete pathology reports. Neovascularity was scored on a scale of 0 to 4, where 0 = no neovascularity detected, 1 = a single vessel <3 mm wide, 2 = a single vessel ≥3 mm wide, 3 = multiple vessels <3 mm wide, and 4 = multiple vessels ≥3 mm wide. Each patient was scored by a senior medical student and then validated by a board-certified abdominal radiologist. Statistical analysis was performed using RStudio® Version 3.5.1. Demographics and tumor characteristics were compared using a Kruskal-Wallis ANOVA or Chi-squared test; neovascular score was compared using a Wilcoxon Rank-Sum test. Statistical significance was defined as p < 0.05.
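
For illustration only, the hedged Python sketch below encodes the 0–4 scoring rules described above and compares scores between two groups with a Wilcoxon rank-sum test; the example scores are hypothetical, and the study’s actual analysis was performed in R/RStudio, not Python.

```python
from scipy.stats import ranksums

def neovascularity_score(n_vessels: int, max_width_mm: float) -> int:
    """Score neovascularity 0-4 per the scale in the abstract:
    0 = none, 1 = single vessel <3 mm, 2 = single vessel >=3 mm,
    3 = multiple vessels <3 mm, 4 = multiple vessels >=3 mm."""
    if n_vessels == 0:
        return 0
    if n_vessels == 1:
        return 1 if max_width_mm < 3 else 2
    return 3 if max_width_mm < 3 else 4

# Hypothetical (vessel count, max width in mm) observations for two tumor-stage groups.
pt1x_scores = [neovascularity_score(n, w) for n, w in [(0, 0), (1, 2.1), (1, 3.5), (1, 1.8), (2, 2.5)]]
pt3x_scores = [neovascularity_score(n, w) for n, w in [(3, 2.0), (4, 3.2), (2, 2.7), (5, 4.1), (3, 3.8)]]

# Compare the ordinal scores between groups, as described in the methods.
stat, p = ranksums(pt1x_scores, pt3x_scores)
print(f"Wilcoxon rank-sum: statistic={stat:.2f}, p={p:.4f}")
```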

A total of 217 patients were included in this study. There was no significant difference in patient demographics between tumor stages. Additionally, the majority of tumor pathology was clear cell carcinoma, regardless of tumor staging. The average neovascularity score was 1.07 for pT1x tumors, 2.83 for pT2x tumors, and 3.04 for pT3x tumors. The average neovascularity score for the benign counterparts was 0.124, 0.385, and 0.458, respectively. There was a significant difference in neovascularity score between pT1x and pT2x tumors (p = 0.0046), pT1x and pT3x tumors (p < 0.0001), and benign kidneys and kidneys with RCC (p < 0.0001).

Our novel vascular scoring system for renal cell carcinoma demonstrates a significant correlation with RCC pathological tumor staging. This scoring system may be utilized as part of a comprehensive radiological assessment of renal tumors, potentially improving tumor characterization and clinical decision making.

1Loma Linda University School of Medicine, Loma Linda, CA

2Loma Linda University Department of Basic Sciences, Loma Linda, CA

3Loma Linda University Medical Center, Loma Linda, CA

4The University of Texas at El Paso, El Paso, TX

Filipino Americans (FA) are known to have higher rates of thyroid cancer incidence and disease recurrence compared to European Americans (EA). FA are also known to be two times more likely to die of thyroid cancer compared to EA. Epidemiological studies in California have shown that thyroid cancer is the second most common cancer among FA women. Currently, there are no studies that demonstrate the mechanism behind these discrepancies. Evidence shows a strong correlation between obesity and more aggressive forms of thyroid cancer; obesity has an increased frequency in FA populations. The exact connection between the mechanisms of obesity and cancer is poorly understood. This epigenetic phenomenon may be due to microRNAs (miRNAs), which post-transcriptionally regulate gene expression. Dysregulated miRNA profiles have been associated with various diseases including obesity and cancer. MiRNAs are linked to different types of cancer; tumor suppressor genes and oncogenes are subject to modulation by dysregulated miRNAs. No study elucidates the association of miRNAs to tumor staging or prognosis in thyroid cancer health disparities.

In this study, we determined miRNA expression profiles and found significant differences in the miRNA profiles between FA and EA thyroid cancer patients. Our pilot study showed several dysregulated miRNAs, from which we chose to assay dysregulated miR-4633-5p segments that are known to be associated with thyroid cancer signaling. We used QIAGEN’s miRNA extraction kit to obtain high-quality miRNA from paraffin-embedded thyroid tissues. We performed next-generation miRNA sequencing using equal numbers of FA and EA samples and identified the top ten significantly up- and down-regulated miRNAs from the pool of differentially expressed miRNAs by qPCR assays.

Our investigation demonstrated 1.5–2-fold higher expression of the upregulated miR-4633-5p in FA versus EA miRNA samples (n=70) after normalization to controls. In contrast, miR-323b-3p showed no difference between FA and EA after normalization to controls.

For our future work, we plan to analyze multiple up- and down-regulated miRNAs by qPCR, determine whether the miRNA signatures are consistent between samples from FA versus EA, and explore the use of these miRNA signature differentials for affordable and rapid thyroid cancer screening and prognosis.

1Los Angeles County University of Southern California Medical Center, Los Angeles, CA

2Keck Hospital of USC, Los Angeles, CA

Melanoma of unknown primary (MUP) is clinically uncommon and is understudied as a disease. There have been studies evaluating the utility of local resection with radiation therapy for treatment of MUP. However, it has been only within the last few years that MUP has been routinely treated with targeted or immunotherapy.

We conducted a retrospective review of patients with MUP treated at LAC-USC Medical Center and Norris Cancer Center from 2008 to December 1st, 2020. We recorded the presentation, treatment course, and outcomes of each patient within our database. Data points collected include demographic information, clinical staging, size of largest metastases, location and number of metastatic sites. Treatment modalities, including metastatectomy, and systemic therapy were reviewed. The primary outcomes studied were median overall survival and 1-year overall survival.

Data was collected from 32 patients identified as having MUP. Sites of melanoma metastases included lymph node, soft tissue/muscle, lung, liver, brain/leptomeningeal disease, and bone. Thirteen patients (40.6%) were found to have one metastatic tumor, 6 (18.8%) were found to have 2–3 metastases, and 13 (40.6%) were found to have 4+ metastatic tumors on presentation. Two patients were lost to follow up shortly after their diagnosis. The 30 remaining patients had a 14.3-month median survival with 17 (56.7%) surviving past one year.

In total, 15 patients underwent surgical metastatectomy, and 18 patients were treated with immunotherapy. All 13 patients who had a complete resection of their tumor burden survived past one year, with a median survival of 37.0 months following diagnosis, compared with a 2.2-month median survival among the 17 patients who did not have surgery or who had incomplete resection of tumor. The 18 patients treated with immunotherapy (PD-1 ± CTLA-4 inhibition) had a median survival of 23.7 months, with 16 (88.9%) surviving past one year. The 11 patients treated with both complete surgical resection and adjuvant immunotherapy had a median survival of 35.0 months, with all 11 (100%) surviving past one year. When analyzing outcomes of patients with MUP based on the number of metastases, number of organs involved, and largest size of metastases, survival was correlated with fewer than 4 metastases and fewer than 2 organs involved. Size of the largest metastasis had no effect on survival outcomes.

Outcomes among patients with MUP may vary depending on treatment modality and tumor burden. Based on our data, patients who have MUP with a low burden of disease may benefit from multi-modality therapy, including both surgical metastatectomy and immune checkpoint blockade. Further validation using larger cohorts is warranted to help confirm these findings.

1UCSF Benioff Children’s Hospital Oakland, Oakland, CA

2University of California San Francisco, San Francisco, CA

Multiple mechanisms may give rise to biallelic variants in NF1-related tumors. Deletion and copy-neutral loss of heterozygosity (LOH) are potential mechanisms of somatic NF1 loss, distinct from point mutations. Tumor multi-gene sequencing demonstrates co-mutations in genes in addition to NF1, which may be tumor dependent and which may help molecularly classify tumors seen in NF1. This study asks whether excised tumors from individuals with NF1 demonstrate additional gene variants and differentiates first and second hits in NF1 using paired germline and somatic sequencing.

The hypothesis is that NF1 second hits and co-mutational patterns may be found by analyzing cancer driver genes. To test this hypothesis, data from 6381 tumors previously sequenced on a 529-cancer-gene panel were analyzed, yielding 391 NF1-mutated tumors. LOH analysis over NF1 was done for all cases.

NF1 LOH was common, seen in 133/391 tumor samples. Forty of the tumors were from individuals with constitutional NF1. Tumors from individuals with constitutional NF1 had more prevalent copy-neutral LOH (p < 0.0001, two-proportion z-test), suggesting somatic intrachromosomal recombination. Osteosarcoma was noted in association with NF1 with copy-neutral LOH, adding to accumulating reports of this rare tumor in NF1. NF1-associated MPNSTs, compared with non-NF1-associated MPNSTs, harbored co-mutations in TP53 as well as CDKN2A/2B deletion. Additionally, NF1 second-hit data from tumors were informative for annotating missense variants with conflicting interpretations in ClinVar, potentially helping to improve NF1 annotation. The results provide an additional 162 deleterious NF1 variants to add to current gene annotation efforts.

Sequencing of paired tumor and normal samples in NF1-associated tumors uncovers a spectrum of second hits to the NF1 locus. Future work will be aimed at a mechanistic understanding of these distinct patterns of mutation and strategies aimed at mitigating tumor risk.

1University of Southern California Keck School of Medicine, Los Angeles, CA

2Children’s Hospital of Los Angeles Saban Research Institute, Los Angeles, CA

Clinical trials use inclusion and exclusion criteria to control for confounding variables in patient populations. Largely inspired by the ASCO-Friends of Cancer Research recommendation documents (2017 and 2021), there has been a recent drive to loosen clinical trial enrollment criteria to improve generalizability in trial outcomes. We sought to determine if the sponsor of a clinical trial impacted the transparency and selection of inclusion and exclusion criteria.

Using clinicaltrials.gov, phase 2 and 3 non-small cell lung cancer (NSCLC) drug trials were sorted into one of three sponsor categories: Industry, government/cooperative group, and academic. Fisher Exact tests were used to assess variability in strictness of specific criteria and level of transparency in listing organ function requirements. Independent sample t tests were used to analyze differences in total number of criteria.
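
As a minimal sketch of the kind of 2×2 comparison described above (the counts are hypothetical, not the study’s data, and the authors’ actual analysis tooling is not stated here), a Fisher exact test in Python might look like:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table (illustration only):
# rows    = industry-sponsored vs. government/cooperative-group-sponsored trials
# columns = strict vs. non-strict performance-status requirement
table = [[40, 10],
         [15, 35]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```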

Industry sponsored NSCLC drug trials more often omit complete organ function requirements from clinicaltrials.gov compared to government/cooperative group (p = 2.3 × 10⁻¹⁰, α = 0.01) and academic (p = 1.8 × 10⁻⁴, α = 0.01) sponsored trials. Industry sponsored trials are also more likely to have stricter performance status requirements compared to government/cooperative group sponsored studies (p = 5.7 × 10⁻⁶, α = 0.01).

Percentage of studies with strict, loose, and no restrictions on performance status. Actual number of studies within each group are included as data labels

Industry funded NSCLC clinical trials are more rigorous in excluding patients with worse performance status and are less transparent in listing all study requirements on clinicaltrials.gov.

University of Washington School of Medicine, Seattle, WA

Informed consent entails that healthcare providers effectively describe adverse effects associated with medical treatments to patients. In radiation oncology, the terms ‘second tumors’, ‘secondary malignancies’, or ‘secondary tumors’ are used in patient consents to describe the appearance of new and different tumors caused by radiation treatment. Furthermore, these incidences are sometimes described in consents as ‘rare’, although the incidence varies greatly from nearly negligible in patients treated with palliative intent, to 20% in patients undergoing myeloablative total body irradiation for stem cell transplant. We evaluated whether non-cancer patients without prior knowledge of or exposure to radiation therapy interpret the terms ‘secondary malignancy’, ‘rare’, and ‘small chance’ in a way consistent with physician intent.

We screened 164 adult subjects who did not require medical interpreters at a university affiliated family medicine clinic, excluding cancer patients and those with any prior knowledge of or experience with radiation treatment. One hundred subjects were eligible for and completed our 12-question multiple choice questionnaire, which assessed their understanding of the term ‘secondary tumor’, and how they would interpret the terms ‘small chance’ or ‘rare’ in the context of a ‘bad side effect’ arising from medical treatment.

Twenty-nine percent of subjects correctly identified that ‘secondary tumors’ referred to new and different tumors caused by treatment. Forty-nine percent thought the term referred to their original tumor coming back, and twenty-two percent thought the term referred to new and different tumors not caused by radiation therapy. In the context of a ‘bad side effect’ occurring ‘rarely’, 2% of subjects attributed ‘rare’ to a 1/10 chance; 16% to a 1/100 chance; 33% to a 1/1000 chance; and 49% to a 1/100,000 chance.

In the context of a ‘bad side effect’ having a ‘small chance’ of occurrence, 8% of subjects attributed ‘small chance’ to odds of 1/10; 33% to 1/100; 41% to 1/1000; and 18% to 1/100,000.

Patients without prior radiation therapy exposure have a demonstrably different understanding than radiation oncologists of the terms ‘secondary malignancy’, ‘second tumor’, or ‘secondary tumor’. Additionally, there is great variability in patient understanding of the terms ‘rare’ or ‘small chance’. Radiation oncologists must use different and more descriptive terms for secondary malignancies and their incidence, to ensure patients are truly informed when undergoing treatment. The results of this study may have implications for all medical fields in which patients are consented for procedures associated with a risk for consequential side effects.

Chemotherapy is a mainstay treatment for late-stage non-small cell lung cancer (NSCLC), yet most tumors develop resistance to these agents. Studies in our lab have shown that chemoresistant NSCLC cells overexpress the muscarinic acetylcholine receptor 1 (CHRM1). We hypothesize that CHRM1 regulates chemoresistance in NSCLC cells, and that the combination of a repurposed CHRM1 antagonist dicyclomine, clinically used to treat IBS, and a chemotherapeutic agent has the potential to sensitize and kill chemoresistant NSCLC cells.

Chemosensitive (A549) and chemoresistant (A549R) NSCLC cells were utilized in this study. Cell survival and colony formation assays were utilized to measure DTX sensitivity by pretreating with the designated drug (24 hr) before addition of DTX (48 hr). Western blot and phospho-kinase array were utilized to measure protein expression and intracellular pathway activation. The designer receptors exclusively activated by designer drugs (DREADD) system was utilized to isolate CHRM1 signaling. All data are expressed as the mean ± SEM. Multiple comparisons were analyzed using one-way ANOVA with post-hoc Tukey’s analysis, and single comparisons were analyzed using a two-tailed, unpaired Student’s t test.
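
To make the statistical workflow concrete, here is a hedged Python sketch of a one-way ANOVA followed by Tukey’s post-hoc comparisons on made-up survival readouts; the group names and values are hypothetical, and the authors’ actual software is not stated in the abstract.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical cell-survival readouts (% of control) for three treatment arms.
dtx       = np.array([88.0, 92.0, 85.0, 90.0])
dtx_dic10 = np.array([60.0, 55.0, 63.0, 58.0])
dtx_dic25 = np.array([32.0, 28.0, 35.0, 30.0])

# One-way ANOVA across the three groups.
f_stat, p = f_oneway(dtx, dtx_dic10, dtx_dic25)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p:.4g}")

# Tukey's post-hoc pairwise comparisons.
values = np.concatenate([dtx, dtx_dic10, dtx_dic25])
groups = (["DTX"] * len(dtx) + ["DTX+Dic10"] * len(dtx_dic10)
          + ["DTX+Dic25"] * len(dtx_dic25))
print(pairwise_tukeyhsd(values, groups))
```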

CHRM1 expression is enhanced in A549R cells, suggesting that CHRM1 may play a role in chemoresistance. This was supported by the ability of the CHRM1 antagonist dicyclomine (Dic) to sensitize A549R cells to the chemotherapeutic agent docetaxel (DTX), as measured by cell survival (IC50: DTX, not reached > 1M; DTX + Dic (10μM), 49.91μM; DTX + Dic (25μM), 12.11μM). These results were reproduced by colony formation assay. However, activation of CHRM1 in A549 cells by the acetylcholine mimetic carbachol did not protect cells from DTX-induced cell death, suggesting that CHRM1 expression is necessary for chemoresistance in the A549R cells, but not sufficient. A phospho-kinase array was used to determine the intracellular signaling pathways activated by CHRM1, which showed increased phosphorylation of multiple kinases including CREB, EGFR, STAT3, and ERK1/2. Increased CREB phosphorylation was validated by western blot with carbachol stimulation in A549R and M1D samples, suggesting these as possible target pathways downstream of CHRM1.

Chemoresistant NSCLC shows increased CHRM1 expression which, when antagonized, resensitizes these cells to DTX-induced cell death. While CHRM1 expression is not sufficient to confer resistance, it is necessary in the A549R cells and may play a role in enhancing EGFR signaling. This suggests a promising potential new therapy for lethal chemoresistant NSCLC utilizing the repurposed IBS drug dicyclomine.

1Washington State University, Spokane, WA

2NorthShore University HealthSystem, Evanston, IL

Favorable chemotherapy response score (CRS) has prognostic value and correlates with progression free and overall survival in advanced ovarian cancer. CRS has not been compared to other clinical measures used to gauge response to neoadjuvant chemotherapy (NACT). We sought to examine whether CRS is a better predictor of outcome compared to traditional clinical and radiographic response measures.

Clinical data from 2003–2020 was obtained through retrospective chart review. Radiographic review pre- and post-NACT was performed via RECIST 1.1 with responses characterized as complete/near-complete (CR/NCR), partial with >50% reduction in tumor (PR>50), partial with <50% reduction in tumor (PR<50), stable disease (SD) and progressive disease (PD). Histologic response in surgical specimens was characterized using CRS 1–3. Survival was assessed using the Kaplan-Meier method with log-rank tests, and Cox regression with hazard ratios (HR).
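
For readers less familiar with these methods, the following is a hedged Python sketch (using the lifelines package and entirely hypothetical toy data, not the study cohort) of a Kaplan-Meier fit, a log-rank test between CRS groups, and a Cox model with CRS as a covariate:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical toy data: months of follow-up, event indicator (1 = recurrence/death), CRS group.
df = pd.DataFrame({
    "months": [10, 24, 36, 60, 8, 30, 55, 62, 15, 48],
    "event":  [1,  1,  0,  0,  1, 1,  0,  1,  1,  1],
    "crs":    [1,  1,  2,  3,  1, 2,  3,  3,  2,  3],
})

# Kaplan-Meier estimate for the CRS 3 subgroup.
crs1, crs3 = df[df["crs"] == 1], df[df["crs"] == 3]
kmf = KaplanMeierFitter()
kmf.fit(crs3["months"], event_observed=crs3["event"], label="CRS 3")
print(kmf.survival_function_.tail())

# Log-rank test: CRS 1 vs CRS 3.
result = logrank_test(crs1["months"], crs3["months"],
                      event_observed_A=crs1["event"],
                      event_observed_B=crs3["event"])
print(f"log-rank p = {result.p_value:.4f}")

# Cox proportional hazards model with CRS as the covariate; exp(coef) gives the hazard ratio.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
print(cph.summary[["exp(coef)", "p"]])
```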

128 patients who underwent NACT for high grade serous ovarian cancer (HGSOC) were included. Increasing CRS was associated with improved recurrence free (RFS) and overall survival (OS). OS at 5 years for CRS 1, 2 and 3 was 24.7%, 57% and 73.7%, respectively (p<0.0001). More favorable radiographic response was predictive of decreased recurrence risk, with RFS at 3 years of 16.3% for PR<50 and 54.8% for CR/NCR (p=0.0005), but was not predictive of OS. Patients with CR/NCR more commonly had CRS 3 vs CRS 1 (47.1% vs 17.7%, p=0.022). Among radiographic response groups, increasing CRS was associated with decreased risk of recurrence and death; for example, among patients with CR/NCR, CRS 1 vs 3 carried a higher risk of recurrence (HR 5.38, p=0.0243) and death (HR 8.24, p=0.006). Number of NACT cycles prior to surgery did not differ among patients regardless of CRS. The rate of R0 resection was similar among all three CRS subgroups and was 85.2% for the entire cohort. Recurrence rates were significantly higher with CRS 1 (89.1%) and CRS 2 (73.9%) compared to CRS 3 (38.9%) (p<0.0001). Median CA 125 prior to surgery was lower with CRS 3 compared to CRS 1 (28 vs 81, p=0.0017). Of the 12 germline BRCA2+ patients in the study, 7 (58.3%) had a pathologic CRS 3.

Our data confirms that favorable CRS is associated with improved overall and recurrence free survival in HGSOC. While radiographic response appears predictive of recurrence, it was not associated with overall survival in our study. Among patients with similar radiographic response, CRS remained predictive of outcome and is associated with other clinical factors traditionally felt to confer favorable prognosis. Pathologic CRS is an important predictive factor in determining response to neoadjuvant chemotherapy in HGSOC and may provide the best means to characterize prognosis.

Charles Drew University of Medicine and Science, Los Angeles, CA

The CDC reported cancer patients as at-risk for severe illness from severe acute respiratory syndrome-related coronavirus 2 (COVID-19). Cancer patients were 2 times more likely than non-cancer patients to exhibit cellular sequelae due to COVID-19. Those with hematological malignancies exhibited a case-fatality rate 2 times more than those with solid tumors. This research aims to educate and enhance community understanding of factors that lead to increased mortality rates in COVID-19 cancer patients by using a community training program in SPA 6 of Los Angeles, California.

Data were obtained from post-training surveys of SPA 6 community members, which included cancer diagnosis, demographics, and knowledge of COVID-19 with cancer. Impact assessments utilized Likert-scale response options to analyze and measure the data. Fisher’s exact test was used to measure and evaluate participant understanding of the community training program with regard to increased mortality rates in COVID-19 cancer patients. Data analyses were performed using SPSS. P-values of <0.05 were considered significant.

There is a significant need for COVID-19 educational training programs for cancer patients in African American and Latino underserved communities. Impact assessments distributed to 100 participants demonstrated positive change in social behavior and willingness to be vaccinated. Post lecture reviews, quizzes, and feedback surveys were distributed to 100 participants. The information received showed a notable change in participants’ overall knowledge of COVID-19 regarding the increased risk in cancer patients.

The data indicate that educational training programs in underserved communities hardest hit by COVID-19 increase understanding of COVID-19 in the context of cancer. The educational training program was associated with a greater increase in willingness to participate in COVID-19 prevention practices and willingness to be vaccinated. This research demonstrates that educational training programs can be utilized to make a significant positive impact on health outcomes and cancer mortality rates.

1Western University of Health Sciences College of Osteopathic Medicine of the Pacific, Pomona, CA

2University of Kansas Medical Center, Kansas City, KS

To review the predictive value of the minute walk physical function test in hematologic malignancy.

A literature review of PubMed using the terms and synonyms of ‘hematologic cancers’ and ‘functional evaluation’ on June 3, 2021 elicited 1,256 manuscripts. After reviewing each abstract for clinical outcomes in relation to minute walk physical function tests in hematologic malignancy, and with the exclusion criteria of confounding intervention or lack of original research, we included 3 published studies.

Increased frailty before and during cancer treatments has been demonstrated to predict mortality, disability, and hospitalization for cancer patients. The minute walk test is an objective measurement of frailty that measures the distance walked in a set amount of time, with decreased distance walked indicative of increased frailty. While there are multiple published manuscripts documenting the association between the minute walk test and clinical outcomes in cancer patients, few studies validate this test in hematologic cancer patients. Our review found 3 studies using the minute walk test as a functional correlate for rates of mortality. Only 1 of the reviewed manuscripts reported a significant increase in mortality with decreased physical function as measured by the minute walk test, while the other 2 studies showed no significant change. The study that showed a significant change used a follow-up period of 2 months, while the studies with nonsignificant results used a 1 to 2 year follow-up.

The frequency of nonsignificant results and the shorter follow-up period of the significant results suggest that the minute walk test may be an unreliable predictor of mortality in hematologic cancers. This affects oncology and physiatry alike. The minute walk is one of multiple frailty assessments that oncologists use to determine the intensity and type of treatment a patient should receive. This would also impact physiatry, as there is a growing practice of ‘pre-rehabilitation’, improving physical function before and during cancer treatment to improve clinical outcomes. If the minute walk test is an inaccurate predictor of mortality, then pre-rehabilitation may focus less on walking mobility. It is possible that the minute walk test may be an accurate predictor of other outcomes in this patient population, such as patient satisfaction or unplanned hospitalizations. Further research, including a meta-analysis, is necessary to determine the predictive value of the minute walk test in hematologic malignancies. As more rehabilitation and oncology practices embrace pre-rehabilitation, the need for validated and standardized methods of objectively assessing physical mobility increases.

1University of California Los Angeles, Los Angeles, CA

2University of California San Francisco, San Francisco, CA

3University of California San Diego, La Jolla, CA

4University of California Davis, Davis, CA

5University of California Irvine, Irvine, CA

55% of infants with gastroschisis in the University of California Fetal Consortium (UCFC) have growth failure (GF). The etiology of GF is multifactorial and associated with caloric/nutrient deficiencies. Intestinal dysbiosis may also play a role. In this prospective study of infants with gastroschisis, we aimed to investigate 1) whether a nutritional pathway would decrease GF, and 2) the relationship between the microbiome and GF.

The UCFC implemented a pathway to decrease GF by standardizing parenteral nutrition dosing, human milk feedings, and GF detection and treatment. Adherence was monitored, and a contemporary cohort (n=45) was compared to a historical cohort (2015–2019, n=125). GF was defined as a decline in weight or length z-score ≥0.8. Shotgun next generation sequencing of the fecal microbiome was performed in a subset of gastroschisis (n=7) and late preterm infants (n=7).
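
As a small illustration of the GF definition used above (a decline in weight or length z-score ≥ 0.8), a hedged Python sketch with a hypothetical helper function might be:

```python
def growth_failure(z_at_birth: float, z_later: float, threshold: float = 0.8) -> bool:
    """Flag growth failure (GF), defined in the abstract as a decline
    in weight or length z-score of at least 0.8."""
    return (z_at_birth - z_later) >= threshold

# Hypothetical example: weight z-score falls from -0.2 at birth to -1.1 at 30 days.
print(growth_failure(-0.2, -1.1))  # True: the decline of 0.9 meets the GF definition
```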

Good adherence to the pathway was noted. Demographics were similar for the cohorts except birth weight (table 1). Historical controls exhibited a decline in weight and length z-scores at 30 days (-0.10 and -0.11 z-score units/week, respectively, p<0.001 for both). In the prospective cohort, weight and length z-scores remained stable. When the cohorts were compared, the prospective cohort demonstrated a decrease in length GF at 14 days (p=0.002), 30 days (p=0.03), and discharge (p=0.002). However, weight GF was similar at all time points. When compared to preterm infants, gastroschisis infants had a higher abundance of Bacteroides thetaiotaomicron (q=0.003), Bacillus coagulans (q=0.061), Lactobacillus animalis (q=0.13), and Akkermansia muciniphila (q=0.13), and less Bifidobacterium bifidum (q=0.19) and Bifidobacterium longum (q=0.19), even after adjustment for delivery mode and antibiotic days (figure 1).

A) Heat map showing associations of bacteria with gastroschisis, delivery mode, and antibiotics; B) Dot plots showing relative abundances (log10 transformed) of species

This study suggests that a multi-institutional nutritional pathway is feasible and may decrease linear GF in infants with gastroschisis. Research is needed to determine how the microbiome contributes to GF in this population.

1University of California Los Angeles David Geffen School of Medicine, Los Angeles, CA

2University of California Los Angeles, Los Angeles, CA

Worldwide, 20,000 infants each year are legally blind from retinopathy of prematurity (ROP). We have demonstrated that preterm infants develop docosahexaenoic acid (DHA) and arachidonic acid (ARA) deficits after birth. These polyunsaturated fatty acids play an important role in regulating inflammation and angiogenesis. The aim of this research is to investigate DHA and ARA status in infants at risk for ROP.

Inclusion criteria for this single-site retrospective study were: gestational age (GA) ≤ 30 weeks or birthweight (BW) < 1.5 kg, and ROP screenings until ROP development, complete vascularization, or 42 weeks postmenstrual age. DHA and ARA in the red blood cell membrane were quantified with gas chromatography-mass spectrometry. DHA, ARA, and ARA:DHA were compared throughout the first month of life, stratified by either severity of ROP (Type 1 ROP, low grade ROP, no ROP) or treatment for ROP.

Table 1 depicts subject demographics. At week 1, ARA was lower in the Type 1 ROP group vs. the no ROP group (17.9±2.2% vs. 20.5±1.7%, p<0.01). At week 2, significant differences were noted in DHA and ARA (figure 1) but not ARA:DHA. No significant differences in DHA, ARA, or ARA:DHA were observed in weeks 3–4.

This study demonstrates that preterm infants with more severe ROP, either Type 1 or ROP requiring treatment, have lower ARA and DHA levels than infants without ROP. It remains unclear if DHA and ARA supplementation shortly after birth will improve ROP outcomes.

University of California Los Angeles David Geffen School of Medicine, Los Angeles, CA

Intravenous lipid emulsions (ILEs) are an important component of parenteral nutrition (PN) for neonates with gastrointestinal disorders (GD). Neonates with GD are at high risk for parenteral nutrition associated cholestasis (PNAC) and associated complications, including liver failure. 100% soybean oil (SO) contains a high concentration of hepatotoxic phytosterols and omega-6 fatty acids, which contribute to PNAC. A composite oil (CO) containing 15% fish oil has high amounts of α-tocopherol and omega-3 fatty acids, and fewer phytosterols. This study aims to compare PNAC and clinical outcomes in infants with GD who received SO or CO.

Inclusion criteria for this observational study included: 1) born between 2014 and 2019, 2) GD (gastroschisis, omphalocele, intestinal atresia, motility disorder, volvulus, necrotizing enterocolitis, or intestinal perforation), 3) exposure to SO or CO >7 days, and 4) survival to discharge. The primary outcome was cholestasis (conjugated bilirubin (CB) >1 mg/dL). Gas chromatography/mass spectrometry was used to measure fatty acids in the red blood cell membrane in a subset of infants.

The mean (±SD) gestational age was 37±3 and 36±3 weeks for the SO (n=29) and CO (n=21) groups, respectively (p=0.47). The two groups were well matched for GD diagnosis (p=0.5) and number of GI surgeries (1.8±0.8 for both groups, p=0.90). Nutrition delivery was similar for the SO and CO groups, including days to full enteral feeds (33±32 vs. 30±25 days, p=0.85) and ILE days (25±21 vs. 30±27 days, p=0.77). Weight z-score declined from birth to discharge (-1.0±0.9 vs. -0.8±1.0, p<0.01 for both), but there was no difference between groups (p=0.52). There was no difference in PNAC incidence (48% vs 48%, p=0.99) and maximum CB (2.0±1.8 vs. 1.9±1.6 mg/dL, p=0.79) when the SO group was compared to the CO group (figure 1). Fatty acid profiles were similar between the two groups.

In this study of infants with GD, when compared to infants who received SO, infants who received CO had similar fatty acid trajectories, growth, and clinical outcomes, including PNAC. Further investigation is needed to determine the optimal ILE to decrease PNAC incidence in this population.

1University of California San Francisco, San Francisco, CA

About 4–8% of pregnant women are treated with selective serotonin reuptake inhibitors (SSRI). SSRI exposure in the third trimester may cause poor neonatal adaptation and abnormal movement in neonates, both potential signs of encephalopathy. We assessed whether exposure to SSRI during the third trimester of pregnancy, and dose of SSRI, are associated with neonatal encephalopathy (NE).

In a cohort study comprising all Kaiser Permanente Northern California births ≥ 35 weeks from 2011 to 2019, we defined NE as a 5-minute APGAR score <7 plus an abnormal level of consciousness, activity, tone, or reflexes. We used logistic regression to adjust for potential confounders.
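
As a hedged sketch of this kind of confounder-adjusted logistic regression (the data are simulated and the variable names are hypothetical; the study’s actual model specification and software are not given in the abstract):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated, hypothetical cohort: NE outcome, SSRI exposure flag, and two adjustment variables.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "ssri": rng.integers(0, 2, n),
    "maternal_age": rng.normal(30, 5, n),
    "depression_or_anxiety": rng.integers(0, 2, n),
})
logit_p = -5 + 1.0 * df["ssri"] + 0.02 * (df["maternal_age"] - 30)
df["ne"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Confounder-adjusted logistic regression; exponentiated coefficients are adjusted odds ratios.
model = smf.logit("ne ~ ssri + maternal_age + depression_or_anxiety", data=df).fit(disp=False)
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals on the odds-ratio scale
```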

Of 305,426 infants, 8,024 (2.6%) were exposed to SSRI in the third trimester, and 510 (0.17%) had NE. After adjusting for maternal depression or anxiety, maternal age, race, and hospital, exposed neonates had 2.7 times higher odds of NE (95% CI 1.9–3.8). The average risk difference between SSRI-exposed and unexposed mothers was 2.7/1000 (95%CI 1.8–4.1/1000). This relationship was dose-dependent. Each 25mg/d increase in the sertraline equivalent dose was associated with a 31% (95% CI: 23–39%) increase in the odds of developing NE.

Neurologic outcomes of neonates exposed and unexposed to SSRI

Exposure to SSRI in the third trimester is associated with an increased risk of neonatal encephalopathy. Given that NE is rare, if this association is causal, the number needed to cause is high (N~370) and should be balanced against the potential maternal and neonatal benefits of treatment. Future directions include EEG and MRI analyses to correlate SSRI exposure with severity of NE and brain abnormalities.
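
A hedged back-of-the-envelope check, using only the figures reported above (the exponential form is the standard logistic-regression interpretation of a per-dose odds increase and is assumed here rather than stated in the abstract):

```latex
\mathrm{NNH} \approx \frac{1}{\text{risk difference}}
             = \frac{1}{2.7/1000}
             \approx 370,
\qquad
\mathrm{OR}_{+25\,\mathrm{mg/d}} = e^{25\hat{\beta}} \approx 1.31
```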

University of Utah Health, Salt Lake City, UT

Breastfeeding is a well-established non-pharmacological way to reduce the severity of neonatal opioid withdrawal syndrome (NOWS) in infants with in-utero drug exposure. Given the barriers imposed on mothers during their inpatient stay during the COVID-19 pandemic, the purpose of this study was to assess the breastfeeding rate among infants with NOWS during that time. This study also compared the rate of breastfeeding at discharge and the trends in substance use to our published cohort at the University of Utah prior to the COVID-19 pandemic.1

This was a retrospective chart review at a single academic center, the University of Utah. Infants born at ≥34 weeks gestational age between January 1 and December 31, 2020, who received Neonatal Withdrawal Inventory (NWI) scoring were reviewed. Infants who received NWI scoring for non-intrauterine drug exposure were excluded. We calculated rates of breastfeeding eligibility, initiation, and continuation at discharge. Eligibility for breastfeeding was determined by the provider permitting such use. We additionally noted infant and maternal demographic data, modes of delivery, and drug exposures per cord toxicology screens.

Of the 125 infants reviewed, 102 infants met eligibility. Table 1 summarizes the data. Mothers of 77% of infants received medication-assisted therapy (MAT) compared to only 61% in our prior study. Similar to our prior study, 21% of infants had isolated opioid exposure compared to 79% with polysubstance exposure, which included opioid and non-opioid substances. Sixty-five (64%) of the infants were deemed eligible to breastfeed or to receive expressed maternal milk. Fifty-seven (56% of total, 88% of breastfeeding eligible) infants received maternal milk at least once during hospitalization. However, only 37 (36% of total, 57% of breastfeeding eligible) infants were receiving maternal milk at discharge, compared to 48% in our prior study. Eighteen (18%) infants were discharged to an adoptive family or state custody; three of them were eligible to receive maternal milk but did not due to social limitations.

Despite a higher rate of maternal MAT with no change in the substance exposure rates, infants with NOWS during COVID-19 suffered from the loss of benefits of breastfeeding/breastmilk feeding. The provision of maternal milk when medically safe in infants with NOWS is vital to optimizing short- and long-term outcomes. However, in this population of vulnerable mother-infant dyads, establishing and sustaining breastfeeding remains a complex challenge particularly during the COVID-19 pandemic when additional psychosocial factors and unanticipated barriers may dominate.

Morris E, et al. Am J Perinatol 2020.

1University of Southern California, Los Angeles, CA

2University of Maryland School of Medicine, Baltimore, MD

Greater use of mother’s milk (MM) is associated with improved outcomes for preterm infants admitted to neonatal intensive care units (NICUs). Healthcare disparities exist in MM provision to preterm infants and further research is needed to better identify barriers to providing MM in high-risk populations. Our urban Level IV NICU serves a patient population who are 60% non-Hispanic Black, allowing us to better study this important demographic. The study objective was to evaluate incidence and predictors of provision of MM to early preterm non-Hispanic Black infants in Baltimore, Maryland.

We performed a retrospective medical record review of non-Hispanic Black infants (as identified by their mother) born <34 weeks gestational age (GA), between 9/2014 – 12/2020 in an urban Level IV NICU. We performed bivariate analyses comparing: 1) maternal and neonatal characteristics of infants who received MM at any point during NICU admission vs. those who did not, and 2) neonatal outcomes based on exposure to any MM vs. none.

We identified 422 early preterm, non-Hispanic Black infants during the study period, of whom 332 (79%) received some MM during their NICU admission. Maternal factors associated with receiving no MM during admission included higher maternal gravidity (p=0.0011), increased parity of term deliveries (p<.0001) and mothers with increased number of living children (p<.0001). Maternal age and medical comorbidities such as pre-eclampsia, chronic hypertension, and diabetes did not have a significant impact on provision of MM. Infants of mothers with bipolar disorder were less likely to receive MM (p=0.0068) while those of mothers with anxiety were more likely to receive MM (p=0.0245). There was no difference in MM provision for those whose mothers had pre-existing depression or who screened positive for postpartum depression. Mothers of infants who did receive MM were significantly more likely to have had documented lactation consultation during admission (74% vs. 20%, p<0.001). Infants who received no MM had higher birth weights (p<0.0001), were born less prematurely (p=0.0002), and were more likely to have been on a ventilator (p=0.0219) during their admission, though there was no difference in rates of intraventricular hemorrhages.

Identifying barriers to MM provision for non-Hispanic Black infants will allow clinicians to focus supportive and educational interventions. Interestingly, although medical comorbidities such as diabetes, hypertension, and depression did not lower likelihood of providing MM, having more living children did decrease incidence of MM provision. Inpatient lactation consultation had one of the strongest associations, so enhancing access to lactation consultation may significantly increase MM provision in early preterm neonates.

1The University of Arizona College of Medicine Phoenix, Phoenix, AZ

2Division of Neonatology, Phoenix Children’s Hospital, Phoenix, AZ

Optimal nutrition is essential to overcome common disease processes in preterm and high-risk term newborns; however, introduction of enteral feedings creates a possible risk of developing necrotizing enterocolitis (NEC). NEC is a potentially devastating inflammatory disease of the gastrointestinal tract, which can result in intestinal perforation and possibly death. Simple interventions such as prioritizing human milk over formula feeds and following a standardized feeding protocol for initiating and advancing feeds are well established practices for improving outcomes and reducing NEC. Phoenix Children’s Hospital Division of Neonatology was established in 2020, providing medical services to a level 4 NICU and to two level 2 NICUs. We introduced a standardized feeding protocol and used quality improvement methodology to measure compliance among our non-surgical infants < 37 weeks, with a goal of increasing compliance by > 10%, and we measured human milk use during hospitalization and at discharge in infants of all gestational ages, with a goal of increasing human milk consumption by > 10%.

We collected data on our feeding practices at all 3 NICUs to measure compliance with the feeding protocol from December 2020 to July 2021. Outcomes were compared in 2 distinct epochs: Epoch 1 from December 2020 to March 2021 and Epoch 2 from April to July 2021. To increase compliance with our feeding protocol, we educated providers and nurses about the protocol upon its roll-out. Awareness was increased by posting copies at medical provider work stations (February 2021), reviewing interim compliance data with the medical team (April 2021), and placing copies in bedside charts (May 2021).

Feeding data was tracked on 265 infants. The mean gestational age and birth weight were 36 weeks (± 3 weeks) and 2700g (± 100g). In both epochs, breast milk was used for the initial feed in 58% of all babies admitted to the NICU. The mean time to full feeds was 4 days ± 2 days in preterm infants < 34 weeks. Compliance with protocol improved with time from 72% in epoch 1 to 77% in epoch 2 in babies <37 weeks, and from 66% to 75% in babies < 34 weeks. Babies discharging home exclusively on breast milk increased from 16.9% in epoch 1 to 43.5% in epoch 2. There was 1 case of medical NEC in both epochs and no cases of surgical NEC.

In this quality improvement project, we improved compliance with a feeding protocol and increased exclusive human milk usage through hospital discharge. While there was 1 case of medical NEC in both epochs, there were no cases of surgical NEC in our data set.

University of Southern California, Los Angeles, CA

The AAP recommends the use of expressed breast milk (EBM) or donor human milk (DHM) in preterm infants, fortified with proteins, minerals, and vitamins to ensure optimal nutrient intake. Unfortunately, the implementation of EBM/DHM fortified with human milk-based fortifiers (EHM) can place an economic burden on individual institutions, raising concerns about the economic feasibility of such products. The objective of this study is to assess the clinical impact of using EHM in very low birth weight (VLBW) infants and to perform a cost-benefit analysis of its use.

This was a retrospective study of all VLBW infants admitted to the neonatal intensive care unit before and after implementation of EHM use. Neonatal demographics and clinical outcomes such as necrotizing enterocolitis (NEC), severe retinopathy of prematurity (ROP), bronchopulmonary dysplasia (BPD), late-onset sepsis (LOS), and average length-of-stay (ALOS) were collected from January – December 2016 (before implementation) and January – December 2020 (after implementation). The net cost to the institution was estimated using published data for each outcome measure.

After excluding deceased infants in both time periods, 45 infants were included in the pre-EHM analysis period (mean birth weight (BW): 1034 g, mean gestational age (GA): 27.9 weeks), and 27 infants were included in the post-EHM analysis period (mean BW: 1070 g, mean GA: 28.8 weeks). Our institution’s product acquisition cost in 2020 was estimated to be $313,784. Implementation of the EHM protocol was accompanied by a reduction in ALOS of 12.9 days and in average total parenteral nutrition (TPN) use of 7 days per infant in the post-EHM group, equating to a net savings of $1,176,670. While there was a small difference in the number of morbidities between the two time periods, when combining the cost avoidance to include medical NEC and BPD, the estimated financial impact excluding insurance reimbursement rose to $1,331,130 (table 1).

Our preliminary findings suggest that implementation of exclusive human-milk feeding in VLBW infants is a cost-effective option for NICUs that can result in decreases in NEC, BPD, TPN use, and length of stay for these infants at nominal cost.

The Lundquist Institute, Torrance, CA

An exponential increase in the use of electronic cigarettes (e-cig), including by pregnant women, exposes an increasing number of fetuses to potentially harmful e-cig chemicals with little knowledge of its repercussions. Perinatal nicotine exposure-induced asthma is associated with downregulated PPARγ signaling and upregulated Wnt signaling in the developing lung. However, the impact of maternal nicotine vaping on the developing lung is unknown. Here, we use an established rat model to determine the effect of perinatal maternal e-cig vaping on offspring pulmonary function and markers of airway contractility.

Pair-fed pregnant rat dams received saline, vehicle (e-cig without nicotine), or e-cig with nicotine daily from embryonic day 6 until postnatal day (PND) 21. Using an established e-cig delivery system and mimicking real-life puffing topography, dams were exposed to four-sec puffs, one puff (puff volume 35 ml) every 30s, 3h/day, and 7 days/week. Average maternal plasma nicotine level (7±4 ng/ml) using this vaping regimen is well within the range observed in moderate cig smokers. Pups delivered spontaneously at term and breastfed ad-lib, but not directly exposed to e-cig aerosols at any time. At PND21, lung resistance and compliance were determined following the methacholine challenge. At sacrifice, the lungs were collected to determine the expression of airway contractility markers, i.e., α-SMA, Calponin, Fibronectin, Collagen I/III, and key Wnt/PPARγ signaling intermediates by qRT-PCR, immunoblotting, and immunostaining.

Compared to controls, perinatal e-cig exposure resulted in a significant increase in airway resistance and decreased airway compliance following the methacholine challenge. mRNA levels of Collagen III and LEF-1 increased, and those of PPARγ and ADRP decreased in the e-cig group. Immunoblotting showed that in the e-cig group, airway contractility markers (α-SMA, Calponin, Fibronectin, Collagen I and III), Wnt signaling intermediates (β-catenin and LEF-1), and nicotinic acetylcholine receptors α3 and α7 levels increased. In contrast, compared to controls, PPARγ, which interacts directly with Wnt signaling intermediates, levels decreased. Immunostaining of whole lung sections confirmed immunoblotting data.

For the first time, we unequivocally demonstrate offspring asthma following perinatal maternal e-cig vaping and explain the likely molecular mechanisms involved. Our data add to the accumulating evidence contradicting the idea that e-cigs are ‘safe.’

NIH (HL151769, HD127237, HD071731, and HL152915) and TRDRP (23RT-0018, 27IP-0050, and T29IR0737).

University of Utah Health, Salt Lake City, UT

Preterm infants frequently suffer growth restriction, increasing the risk and severity of neonatal lung disease, which is characterized by impaired alveolar development and worse outcomes in male infants. We showed that growth restriction in the prenatal (IUGR) or postnatal (PGR) period reduces rat lung PPARγ gene expression, which results in impaired alveolar development. PPARγ variants, including the novel delta 5 splice variant (PPARγΔ5), can impact the downstream effects of PPARγ activation. As PPARγΔ5 is a dominant negative variant, the effect of increasing PPARγΔ5 is a reduction in PPARγ signaling. Whether PPARγΔ5 is expressed in the rat lung, and the effect of growth restriction on its expression, are unknown. We hypothesize that PPARγΔ5 is expressed in rat lung, and that the combination of IUGR and PGR increases expression of PPARγΔ5.

IUGR and PGR were generated in Sprague Dawley rat pups by bilateral uterine artery ligation and variation in litter size, respectively. Lungs were collected at postnatal day 12 from Control, IUGR only, PGR only, and PGR+IUGR rat pups. Male and female rats were treated as separate groups. PCR, gel electrophoresis, and sequencing were used to confirm the presence of PPARγΔ5 in the rat lung. Full-length and PPARγΔ5 mRNA and protein were assessed using real-time RT-PCR and western blotting. Differences were assessed by one-way ANOVA and Fisher’s post hoc test.

Results are expressed as a percentage of control ± SD, *P<.05. Sequencing confirmed that PPARγΔ5 mRNA is expressed in rat lung at postnatal day 12. The PGR model resulted in significantly lower weights on D12 (66.9±3% for PGR only, 64.8±3.2% for IUGR+PGR). In male rat lung, PPARγΔ5 mRNA was increased (325±79%*) by IUGR+PGR. Similarly, in the male rat lung, PPARγΔ5 protein was increased by IUGR+PGR (163±14%*). In female rat lung, neither PPARγ transcript was affected. However, lung protein levels of PPARγΔ5 were increased in female IUGR+PGR (146±24%*).

We conclude that PPARγΔ5 is expressed in rat lung and that IUGR+PGR increases its expression. We speculate that increased PPARγΔ5 expression in male IUGR+PGR rat lungs may further impair PPARγ signaling, leading to impaired alveolar development.

The Lundquist Institute, Torrance, CA

Nicotine exposure to the developing fetus results in asthma that can be transmitted across generations. However, the underlying mechanism remains unknown. We recently demonstrated differential DNA methylation in the proximity of nicotine-response genes in the sperm of perinatally nicotine exposed F1 animals. Gene ontology and pathway enrichment analysis suggested a possible link between the spermatozoal differential DNA methylation and the offspring asthma phenotype. We hypothesize that nicotine-induced spermatozoal epigenetic changes drive the intergenerational transmission of nicotine-induced asthma. To test this hypothesis, we determined the expression in F2 lungs of genes that were differentially methylated in the spermatozoa of nicotine exposed F1 males.

Sprague Dawley rat dams (F0) received nicotine (1 mg/kg, sc) or saline from embryonic day 6 (E6) until postnatal day 21 (PND21). Pups (F1) were weaned at PND21 and used as breeders to generate F2 without any subsequent exposure to nicotine in the F1 progeny. F2 pups were weaned at PND21. At PND60, F2 males (n=20; 10 control, 10 nicotine) were sacrificed, and their lungs were collected and flash frozen for performing qRT-PCR for the top 11 differentially methylated genes (AABR07051515.1, Dio1, Gabra4, Htr6, Map4k2, Men1, Mnu, Orai2, Rars, Sec1415, and Slc7a11) in sperm cells of the nicotine exposed F1 males.

In line with data from F1 lungs, the top 2 differentially hypermethylated genes in the nicotine exposed F1 sperm cells, AABR07051515.1 (a lincRNA known to modulate lung function) and Dio1 (iodothyronine deiodinase 1), were upregulated and downregulated, respectively (p ≤ 0.05), in F2 lungs. In addition, similar to the F1 progeny, expression of the Mnu and Sec1415 genes was downregulated (p ≤ 0.05) in F2 lungs of the nicotine-exposed group. In contrast, the expression of the other 7 genes differentially methylated in F1 spermatozoa did not change significantly.

Our data further support the concept that perinatal nicotine exposure-induced spermatozoal epigenetic reprogramming, specifically DNA methylation alterations in nicotine response- and lung development-related genes, likely drive the intergenerational transmission of perinatal nicotine-induced asthma.

NIH (HL151769, HD127237, HD071731, and HL152915) and TRDRP (23RT-0018; 27IP-0050; and T29IR0737).

University of Colorado – Anschutz Medical Campus, Aurora, CO

Pulmonary hypertension (PH) associated with bronchopulmonary dysplasia (BPD) leads to worse outcomes in former preterm neonates. Serotonin (5-hydroxytryptamine, 5-HT) is a potent pulmonary vasoconstrictor and smooth muscle mitogen and is increased in the lungs of infants who died with severe BPD. Tryptophan hydroxylase 1 (TPH1), the rate-limiting enzyme in 5-HT synthesis, is increased in adult patients and animals with experimental PH. Serotonin signaling blockade decreases pulmonary vascular resistance and prevents pulmonary vascular remodeling in preclinical models. We hypothesized that TPH1 knock-out (KO) neonatal mice would be protected from hypoxia-induced BPD associated with PH.

Neonatal wild-type (WT) and TPH1 KO offspring were placed in hypoxia or remained in normoxia at Denver altitude for 2 weeks. To assess alveolar development, inflation-fixed lungs were analyzed for surface area (SA) and mean linear intercept (MLI). To quantify the total number of small vessels (<30 μm), lung sections were immunostained with Factor VIII. PH was assessed by Fulton’s index and right ventricular systolic pressures (RVSP). Platelet-poor plasma (PPP), platelet, and lung homogenate 5-HT levels were measured by ELISA. Data were analyzed in Prism with unpaired t-test or 2-way ANOVA with Bonferroni post-hoc analysis. The significance level was p<0.05.
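
For readers unfamiliar with the analysis design, the following minimal sketch shows a two-factor (genotype x oxygen) ANOVA with Bonferroni-corrected pairwise comparisons, comparable in spirit to the Prism workflow described; the data frame columns and values are hypothetical.

```python
# Sketch: two-way (genotype x oxygen) ANOVA with Bonferroni-corrected pairwise
# comparisons, e.g. on mean linear intercept (MLI). Data are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from scipy import stats

df = pd.DataFrame({
    "genotype": ["WT"] * 6 + ["KO"] * 6,
    "oxygen":   (["normoxia"] * 3 + ["hypoxia"] * 3) * 2,
    "mli_um":   [35, 37, 36, 48, 50, 49, 36, 35, 38, 47, 51, 50],
})

# Two-way ANOVA with interaction term
model = ols("mli_um ~ C(genotype) * C(oxygen)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Bonferroni correction across the two within-genotype comparisons
for geno, sub in df.groupby("genotype"):
    norm = sub.loc[sub.oxygen == "normoxia", "mli_um"]
    hyp = sub.loc[sub.oxygen == "hypoxia", "mli_um"]
    t, p = stats.ttest_ind(norm, hyp)
    print(f"{geno}: normoxia vs hypoxia, Bonferroni-adjusted p = {min(p * 2, 1.0):.3g}")
```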

At baseline, WT mice had higher platelet-poor plasma, platelet, and lung 5-HT levels than KO mice (53±6 vs 9±1, p<0.0001; 275±11 vs 56±12, p<0.0001; 18±2 vs 10±1, p<0.004; respectively, ng/mL). TPH1 KO mice were not protected from hypoxia-induced alveolar simplification, with no difference from WT mice in MLI or SA, nor were they protected against hypoxia-induced pulmonary vascular simplification, with no difference from WT mice in vessel density. TPH1 KO mice showed attenuated hypoxia-induced pulmonary vasoconstriction, demonstrated by a reduction in RVSP (32±0.66 vs 29±0.55 mmHg, p<0.006). There was less PPP and platelet 5-HT in hypoxia-exposed WT mice compared to WT mice at baseline (20±2 vs 53±6, p<0.0001 and 117±27 vs 275±11, p<0.0001, respectively, ng/mL). There was less lung 5-HT in hypoxia-exposed KO mice than in KO mice at baseline (2±1 vs 10±1 ng/mL, p<0.001).

Neonatal TPH1 KO mice are not protected against hypoxia-induced lung injury. Surprisingly, this study contradicts the current understanding of the role of 5-HT in adults with PH and in adult models of hypoxia-induced PH. We found decreased plasma and platelet 5-HT following hypoxia exposure. We speculate that decreased 5-HT observed in hypoxia may contribute to neonatal hypoxia-induced alveolar simplification and impaired vascular development. Further studies are needed to elucidate the role of 5-HT in the developing lung.

University of Colorado – Anschutz Medical Campus, Aurora, CO

Pulmonary hypertension (PH) associated with bronchopulmonary dysplasia (BPD) leads to worse outcomes in former preterm neonates. Elevated platelets at birth are an independent predictor of BPD, increased platelet derived protein after birth is associated with higher rates of neonatal pulmonary vascular disease, and perinatal platelet transfusions are associated with higher rates of mortality and BPD. Circulating platelets from neonatal mice with experimental PH are increased and express a higher percentage of active αIIbβ3, a marker of platelet activation. NBEAL 2 knock-out (KO) mice lack platelet alpha granules, have low platelet counts, and have decreased platelet function in vitro and in vivo. We hypothesized that NBEAL 2 KO neonatal mice would be protected from hypoxia-induced PH.

Neonatal wild-type (WT) and NBEAL 2 KO offspring were placed in hypobaric hypoxia (18,000 feet) or remained in normoxia at Denver altitude for 2 weeks. PH was assessed by Fulton’s index as a marker of right ventricular hypertrophy (RVH) and right ventricular systolic pressures (RVSP). Data were analyzed by Prism with unpaired t-test or 2-way ANOVA with Bonferroni post-hoc analysis. Significance level p<0.05.

Right ventricular systolic pressure was higher in NBEAL2 KO mice than in WT mice at baseline (24.2±0.5 vs 21.5±0.3 mmHg, p<0.0001). There was no difference in baseline right ventricular hypertrophy between NBEAL2 KO and WT mice (0.27±0.01 vs 0.31±0.01, ns). NBEAL2 KO mice displayed a comparable hypoxia-induced increase in RVSP compared to WT mice (29±1 vs 30±1 mmHg, ns). NBEAL2 KO mice displayed a comparable hypoxia-induced increase in RVH compared to WT mice (0.3±0.03 vs 0.4±0.03, ns).

Platelet alpha granule deficiency is a risk factor for neonatal pulmonary vasoconstriction at baseline. Further studies are needed to elucidate the role of platelets in neonatal PH associated with BPD.

1University of California San Diego, La Jolla, CA

2SickKids Research Institute, Toronto, ON, Canada

3Johns Hopkins University, Baltimore, MD

4Sanford Burnham Prebys Medical Discovery Institute, La Jolla, CA

Metabolism is vital to cellular function and tissue homeostasis during human lung development. In utero, embryonic stem cells undergo endodermal differentiation towards a lung progenitor cell (LPC) fate that can be modeled in vitro using pluripotent stem cells (hPSCs). We previously showed differences in lung cell composition and gene expression between wild type and surfactant protein B (SP-B) deficient lung organoids. These differences may be impacted by changes in metabolites during early lung development. We hypothesize that SP-B deficient cells will express a different metabolomic profile compared to wt cells during the differentiation to lung progenitor cells.

To examine metabolites that differ during endodermal differentiation, we used an untargeted metabolomics approach to evaluate the changes in metabolites at the stem cell (hPSC), definitive endoderm (DE), anterior foregut endoderm (AFE) and lung progenitor (LPC) stage between wt and SP-B deficient cell lines. At each differentiation step, the cells were sorted for surface markers specific to their differentiation stage in quadruplicate. The homogeneous cell lysates were analyzed using a Biocrates p180 metabolite kit including hexoses, amino acids, phosphatidylcholines, lysophosphatidylcholines, sphingolipids, acylcarnitines, and biogenic amines. The metabolomic multivariate data analysis was performed using XLSTAT.2016 software (Addinsoft) and MetaboAnalyst.
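
As a rough illustration of the multivariate step, the sketch below runs a PCA on autoscaled metabolite data, analogous to the overview plots such tools produce; the metabolite matrix here is randomly generated and purely hypothetical.

```python
# Sketch: PCA overview of a targeted metabolomics matrix (samples x metabolites),
# analogous to the multivariate analyses available in MetaboAnalyst-like tools.
# The data below are randomly generated placeholders, not study data.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
stages = ["hPSC", "DE", "AFE", "LPC"]
data = pd.DataFrame(
    rng.normal(size=(16, 20)),  # 4 replicates per stage x 20 metabolites
    index=[f"{s}_{i}" for s in stages for i in range(4)],
)

# Autoscale each metabolite, then project samples onto the first two components
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(data))
print(pd.DataFrame(scores, index=data.index, columns=["PC1", "PC2"]).round(2))
```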

We found that the largest metabolic changes during endodermal differentiation occurred from hPSC to DE with a change from glycolytic respiration to oxidative phosphorylation. The metabolites most enriched during the differentiation from hPSC to LPC, independent of cell line, were sphingomyelin and lecithin. In the wt cell lines, metabolites for oxidation of fatty acids and tryptophan metabolism were up-regulated, while metabolites for ammonia recycling and aspartate metabolism were down-regulated. In the SP-B deficient cells, metabolites in fatty acid oxidation and carnitine synthesis were up-regulated and metabolites for amino acid metabolism, the urea cycle, and multiple energy-based pathways were down-regulated.

Differentiation to lung progenitor cells from pluripotent stem cells resulted in increased fatty acid metabolism and decreased urea cycle and aspartate metabolism in both wt and SP-B deficient cell lines. Therefore, metabolite composition in early lung development is not influenced by the loss of SP-B expression.

1University of Colorado – Anschutz Medical Campus, Aurora, CO

2The University of Texas at Austin College of Natural Sciences, Austin, TX

Maternal vitamin D deficiency (M-VDD) is associated with perinatal pulmonary morbidities. We have demonstrated that offspring of rodent maternal VDD dams have sustained abnormalities of distal lung structure, increased airway hyperreactivity and abnormal lung mechanics. In pulmonary endothelial cells, vitamin A (VA) and vitamin D (VD) co-dimerize on retinoid x receptor. VA therapy has been shown to improve lung development in pre-clinical and clinical studies, but whether combined postnatal (PN) treatment with VA and VD further enhances lung development in offspring of M-VDD dams is unknown. Therefore, we seek to determine if PN VA and VD supplementation improves lung development and function in offspring of M-VDD dams.

Newborn rats from control (CTL) and M-VDD dams received daily treatment with retinoic acid (VA) alone, VA plus 1,25-OHD (VD) (VA-VD), or saline (SAL) for 14 days. On DOL 14, lung structure was assessed by mean linear intercept (MLI), radial alveolar count (RAC), and pulmonary vessel density (PVD). Lung mechanics were measured using flexiVent.
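
For context, mean linear intercept is commonly computed as the total test-line length divided by the number of alveolar-wall intercepts; the short sketch below illustrates that calculation with hypothetical counts (the line lengths and intercept counts are not from this study).

```python
# Sketch: conventional mean linear intercept (MLI) calculation for alveolar
# morphometry. Line lengths and intercept counts are hypothetical placeholders.
def mean_linear_intercept(line_length_um, n_lines_per_field, intercepts_per_field):
    """Total test-line length divided by total alveolar-wall intercepts."""
    total_length_um = line_length_um * n_lines_per_field * len(intercepts_per_field)
    total_intercepts = sum(intercepts_per_field)
    return total_length_um / total_intercepts

# e.g. 10 test lines of 500 um per field, intercept counts from 5 fields
print(round(mean_linear_intercept(500, 10, [110, 105, 120, 98, 112]), 1), "um")
```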

Lungs from VDD-SAL rats had increased MLI (p<0.001) and decreased pulmonary vessel density (p<0.05) as compared to CTL-SAL. VDD rats that received VA had increased RAC compared to VDD-SAL (p<0.05). VDD-SAL rats had increased resistance (p<0.01) and decreased compliance (p<0.01) as compared to CTL-SAL. VDD-VA rats had decreased elastance as compared to VDD-SAL pups (p<0.05).

M-VDD decreases distal lung and vascular development and impairs lung function in infant rats. PN VA therapy improved RAC and decreased elastance in VDD pups. These findings suggest that PN VA therapy may improve alveolarization and lung mechanics in M-VDD pups with abnormal lung development. We speculate that M-VDD leads to persistent abnormalities in infant lung growth that may be responsive to PN VA.

1University of Colorado – Anschutz Medical Campus, Aurora, CO

2The University of Texas at Austin College of Natural Sciences, Austin, TX

Vitamin D deficiency (VDD) during pregnancy is associated with chronic lung disease in preterm infants, and the underlying mechanisms are not understood. We have shown that vitamin D (VD) preserves lung structure and prevents pulmonary hypertension (PH) in an experimental model of bronchopulmonary dysplasia, and that VD treatment increases pulmonary artery endothelial cell growth and function. However, the direct effects of maternal VDD on pulmonary endothelial cell (PEC) growth and function are unknown. Thus, we seek to determine whether PEC from newborn rats exhibit altered growth and mRNA expression at birth after exposure to maternal VDD and whether these changes persist during infancy.

Female rats were fed VDD chow and shielded from UV-B light to achieve 25-OHD levels less than 10 ng/ml before mating. PEC were isolated from offspring of maternal VDD (VDD) or control (CTL) dams at postnatal days 0 and 14. PECs were used for proliferation assays and response to exogenous VEGF and 1,25-OHD. PEC lysates were also collected for RT-qPCR analysis.

PEC isolated from VDD pups at both D0 and D14 demonstrated decreased growth compared to CTL D0 and D14 PEC (p<0.01). VEGF or 1,25-OHD treatment increased CTL PEC growth at both D0 and D14 compared to untreated CTL D0 and D14 PEC (p<0.01). In contrast, neither VEGF nor 1,25-OHD treatment increased D0 VDD PEC growth. D14 VDD PEC showed increased growth with VEGF treatment compared to untreated D14 VDD PEC (p<0.01). RNA isolated from D0 VDD PEC demonstrated decreased expression of KDR and eNOS and increased VEGF expression compared to D0 CTL PEC (p<0.01); no expression changes were seen at D14.

We found that D0 PEC from newborn offspring of maternal VDD dams demonstrate decreased baseline PEC growth and no responsiveness to angiogenic stimuli. At D14 VDD PEC grew poorly at baseline, and were responsive to VEGF but not 1,25-OHD treatment. We speculate that maternal VDD disrupts normal PEC function, which persists into postnatal life and may contribute to high risk for late cardiopulmonary disease.

The Lundquist Institute, Torrance, CA

Dysregulated peripheral circadian rhythm is associated with enhanced inflammatory response and cellular senescence. Recent studies have demonstrated an association of exposure to cigarette smoke and dysregulated peripheral molecular clock in Chronic Obstructive Pulmonary Disease (COPD) and asthma patients. This has also been confirmed in rodent models. Although developmental smoke/nicotine exposure predisposes to asthma and COPD, its impact on circadian clock genes is unknown. Here, we test the hypothesis that developmental nicotine exposure alters the molecular clock, which lasts well into adult life.

Pair-fed pregnant Sprague-Dawley rat dams received once-daily 1mg/kg nicotine or saline diluent from embryonic day 6 (E6) to postnatal day 21 (PND21). Lungs from pups were collected on E21, PND21, or PND60 and flash-frozen for later mRNA and protein analysis. The expression of core clock genes (Bmal1, Clock, Cry1, Cry2, Per1, Per2, Rev-erba, Rev-erbb, Rora, and Sirt1) was determined by qRT-PCR on mRNA isolated from lungs. Protein levels of key clock genes Bmal1, Clock, and Rev-erba were determined using western analysis on proteins extracted from PND 21 lungs.

Overall, the mRNA expression of Bmal1, Clock, Cry1, Cry2, Per1, Per2, Rev-erba, Rev-erbb, Rora, and Sirt1 was significantly decreased (p<0.05) in the nicotine-treated group vs. the control group at E21 and PND21. Perinatal nicotine exposure-induced downregulation of the key clock genes Bmal1 and Rev-erba was also confirmed at the protein level by western analysis at PND21. Interestingly, the expression of several clock genes that were down-regulated at E21 and PND21 in the nicotine-treated group was either not different or upregulated versus the control group at PND60, suggesting a dynamic effect of perinatal nicotine exposure on the peripheral molecular clock.

Perinatal nicotine exposure leads to peripheral clock dysregulation in the lung that lasts at least through adolescence. These results suggest a new mechanism that underlies the effects of perinatal nicotine-induced lung injury. Further studies are needed to determine the impact of perinatal nicotine exposure-induced dysregulated peripheral clock on lung health, gender specificity, and how long these effects last.

NIH (HL151769, HD127237, HD071731, and HL152915) and TRDRP (23RT-0018; 27IP-0050; and T29IR0737).

1University of Colorado, Aurora, CO

2University of Colorado Denver – Anschutz Medical, Aurora, CO

Pulmonary hypertension (PH) is a life-threatening condition that affects infants, children, and adults. However, treatment strategies are limited, and morbidity and mortality remain significant. We have previously demonstrated in robust animal models that serotonin (5-HT) contributes to the pathogenesis of experimental neonatal PH and know that infants who died due to severe lung disease have a 34-fold increase in lung 5-HT. We designed an exploratory pilot study to test the hypothesis that systemic 5-HT is increased in infants with persistent pulmonary hypertension of the newborn (PPHN).

Near term and term infants (≥36 weeks) were recruited from the NICUs at Children’s Hospital Colorado and University of Colorado Hospital beginning in March 2021. Infants with culture-proven sepsis, metabolic/genetic abnormality, major cardiac defect, renal failure, or antenatal exposure to SSRIs were excluded. PH was defined on echocardiogram by an estimated systolic pulmonary artery pressure ≥40 mm Hg, end-systolic eccentricity index ≥1.16, or presence of a right-to-left shunt. 5-HT is an unstable neurotransmitter that degrades quickly; thus, we measured its more stable metabolite 5-hydroxyindoleacetic acid (5-HIAA). Urine samples were collected on DOL 1 and DOL 3, and 5-HIAA was analyzed via mass spectrometry. Monthly follow-up samples were collected if PH persisted. Demographics, clinical characteristics, and interventions were obtained through chart review and summarized for the patient cohort. 5-HIAA levels were summarized using medians and ranges.
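
As a small illustration of the descriptive analysis, the sketch below summarizes urinary 5-HIAA by group and day of life using medians and ranges; the values are hypothetical placeholders, not patient data.

```python
# Sketch: summarizing urine 5-HIAA by group and day of life (DOL) with medians and
# ranges. All values below are hypothetical placeholders, not patient data.
import pandas as pd

df = pd.DataFrame({
    "group": ["PPHN"] * 4 + ["control"] * 4,
    "dol":   [1, 1, 3, 3, 1, 1, 3, 3],
    "hiaa":  [14.9, 16.2, 18.0, 21.5, 14.2, 13.8, 15.0, 16.1],
})

summary = df.groupby(["group", "dol"])["hiaa"].agg(["median", "min", "max"])
print(summary)
```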

To date, 6 infants with PPHN and 7 age-matched controls have been enrolled. 54% were male and 46% female. The mean gestational age was 38.2 weeks. All infants with PPHN were classified as having severe PH on initial echo. 83% were born with congenital diaphragmatic hernia (CDH) and 100% had a patent ductus arteriosus (PDA). 83% required vasopressors, with 60% initiated in the delivery room. 100% required steroids for blood pressure and/or respiratory support. 83% required pulmonary vasodilators with inhaled nitric oxide and sildenafil being the most common. 50% of infants with PPHN were followed for refractory PH for a mean of 2.67 months. At DOL 1, the median 5-HIAA level was 14.89 (min, max: 14.17, 19.1) in the PPHN group and 14.17 (11.56, 16.63) in the control group. At DOL 3, the median was 18 (8.89, 40.5) in the PPHN group and 15 (10.95, 17.14) in the control group.

This study investigated the association of PPHN with systemic alterations in 5-HIAA. Our current results offer a preliminary description; however, enrollment is ongoing. With additional data we will test our hypothesis that 5-HIAA is significantly associated with PH severity. Ultimately, we aim to establish it as a noninvasive biomarker to follow treatment response and predict the later development of PH in high-risk infants.

1Western University of Health Sciences, Pomona, CA

2Cedars-Sinai Medical Center, Los Angeles, CA

3University of Southern California Keck School of Medicine, Los Angeles, CA

4Mayo Clinic Arizona, Scottsdale, AZ

Alzheimer’s disease (AD) is commonly characterized by pathognomonic amyloid-beta (Aβ) burden in the brain, and recent reports demonstrate the vital role of cerebral vascular pathology in AD development. Given that the retina is a CNS organ amenable to noninvasive imaging, our team previously pioneered retinal curcumin-fluorescence imaging (RFI) and identified a significant correlation of retinal amyloid burden in the proximal mid-periphery (PMP) of the superotemporal retina with cognitive performance and hippocampal volume. The growing hypothesis of vascular neuropathology in AD, coupled with the clinical feasibility of RFI for imaging both vasculature and Aβ, supports incorporating both neurovascular and retinal Aβ measures into early AD detection. Considering the crucial yet unmet need for such multimodal detection models, we used RFI to examine retinal vascular parameters in relation to retinal Aβ in patients with varying neurocognitive status.

Twenty-nine subjects underwent neuropsychometric cognitive evaluations and quantitative RFI to measure retinal amyloid burden. We also quantified vessel tortuosity index (VTI), inflection index, and branching angle from segmented retinal blood vessels. Using linear regression models, we conducted correlation analyses between retinal vascular and amyloid measures and various cognitive domain Z-scores.
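
To make the correlation step concrete, the sketch below regresses a cognitive Z-score on a combined amyloid-venous VTI index and reports the Pearson r and p value; the variable names and numbers are hypothetical.

```python
# Sketch: linear regression of a cognitive Z-score on a combined retinal
# amyloid-venous VTI index, reporting Pearson r and p. Values are hypothetical.
import numpy as np
from scipy import stats

combined_index = np.array([0.8, 1.1, 1.5, 2.0, 2.3, 2.7, 3.1, 3.6])
memory_z_score = np.array([0.9, 0.5, 0.2, -0.1, -0.4, -0.6, -0.9, -1.2])

res = stats.linregress(combined_index, memory_z_score)
print(f"r = {res.rvalue:.3f}, p = {res.pvalue:.3g}, slope = {res.slope:.3f}")
```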

Total and PMP retinal amyloid counts were markedly increased in patients with cognitive impairment (CI) as compared to those with normal cognition (NC; p = 0.0012). Venous VTI differed significantly across levels of Clinical Dementia Rating (CDR) cognitive scores (p = 0.026). Patients with CI displayed a considerably higher combined PMP amyloid-venous VTI index than NC subjects (p = 0.0068). An increased combined PMP amyloid-venous VTI index significantly correlated with decreased WMS-IV Z-scores (r = -0.537, p = 0.001) as well as with reduced SF-MCS-36 Z-scores (r = -0.338, p = 0.039).

This study reveals that combined PMP amyloid count-venous VTI index may predict verbal memory loss and cognitive-related quality of life performance. Future larger investigations are needed to further refine the practical utility of RFI in a clinical setting.

1University of Idaho, Moscow, ID

2University of Washington School of Medicine, Seattle, WA

In humans, as well as other vertebrates, color vision requires the differential expression of specific cone opsins in photoreceptor cone cells. One model for the regulation of the human long and medium wavelength sensitive (LWS/MWS) opsin tandem array suggests an upstream regulatory region interacts with replicated opsin genes at random, resulting in mutually exclusive expression of a specific opsin. A similar orthologous long wavelength sensitive (lws1/lws2) array in zebrafish provides a good model for study of this regulation. However, our prior investigations into this array suggest that thyroid hormone (TH) and retinoic acid serve as trans regulators in larvae/juveniles (Mitchell et al., 2015, PLOS Genetics; Mackin et al., 2019, PNAS). This study investigates whether cone opsin expression remains plastic to TH treatment in adult zebrafish, where cone distribution is considered stable.

Adult zebrafish (6–18 months old) were treated with NaOH (0.01%, control) or TH (386 nM) for 1 or 5 days. qRT-PCR was performed on homogenized eyes. Whole retinas were processed by in situ hybridization chain reaction and then analyzed by confocal imaging for mRNA expression.

In adult zebrafish, exogenous TH drastically increased lws1 expression in both the 1- and 5-day treated groups (p<1e-7 and p<0.01, respectively) while decreasing lws2 expression (p<0.001 for both). Other phototransduction-related transcripts (gngt2b, rh2–1) also demonstrated expression changes following TH treatment. Exogenous TH induced a drastic shift from lws2 to lws1 in adult zebrafish, consistent with previous studies of larvae and juveniles.

This shift from lws2 expression to lws1 expression occurs within as little as 1 day of TH exposure, showing that cones remain highly plastic even into adulthood. Plasticity in spectral sensitivity (a shift toward sensitivity to longer wavelengths) in response to TH suggests a role for TH in visual system function well into adulthood. These results oppose earlier models suggesting that regulation between tandemly replicated opsin genes is stochastic and fixed.

Western University of Health Sciences College of Osteopathic Medicine of the Pacific, Pomona, CA

The head-twitch response (HTR) is evoked following stimulation of postsynaptic serotonin 2A (5-HT2A) receptors in the prefrontal cortex (PFC). D-Fenfluramine (FF), a selective 5-HT releaser, produces the HTR by releasing serotonin from nerve terminals through the 5-HT uptake carrier working in reverse. Methamphetamine (MA) is a non-selective releaser of the monoamines 5-HT, norepinephrine (NE), and dopamine (DA). We investigated whether pretreatment with either MA (1–5 mg/kg, i.p.) or the selective 5-HT2A receptor antagonist EMD 281014 (0.001, 0.005, 0.01, 0.05 mg/kg, i.p.) can alter: 1) the mean frequency of FF-induced HTR at different ages (20-, 30-, and 60-day old), and 2) the expression of c-fos evoked by FF in different regions of the PFC. We also explored whether blockade of serotonergic 5-HT1A or adrenergic α2 receptors can alter the effect of MA on FF-induced HTR across the above ages.

The HTR was observed for 30 min following the injection of FF in each mouse. We used immunohistochemistry to evaluate changes in c-fos expression in the PFC.

Pretreatment with MA (1–5 mg/kg, i.p.) dose-dependently suppressed the FF-induced HTR across different ages: MA at 1 mg/kg in 20- and 30-day old mice, and at 5 mg/kg in 60-day old mice, significantly suppressed the FF-induced HTR. Pretreatment with EMD 281014 (0.001, 0.005, 0.01, 0.05 mg/kg, i.p.) also blocked the FF-induced HTR in an age- and dose-dependent manner. The selective 5-HT1A receptor antagonist WAY 100635 (0.25 mg/kg, i.p.) and the adrenergic α2-receptor antagonist RS 79948 (0.1 mg/kg, i.p.) significantly reversed the inhibitory effect of MA on the mean frequency of HTR in 20-day old mice, but not in 30- and 60-day old mice. Moreover, FF significantly increased c-fos expression in several PFC regions in 30-day old mice. Despite the inhibitory effect of MA or EMD 281014 on FF-induced HTR, pretreatment with either MA (1 mg/kg, i.p.) or EMD 281014 (0.05 mg/kg, i.p.) significantly increased c-fos expression in different regions of the PFC in 30-day old mice.

The inhibitory effect of MA on the FF-evoked HTR appears to be mainly due to functional interactions between the stimulatory 5-HT2A receptors and the inhibitory 5-HT1A and/or adrenergic α2 receptors. The MA-induced increase in c-fos expression in different PFC regions is probably due to MA-evoked increases in the synaptic concentrations of 5-HT, NE, and/or DA. EMD 281014 failed to prevent the FF-induced increase in c-fos expression, which may reflect increased synaptic 5-HT activating other serotonergic receptors, such as 5-HT1A.

2University of Arizona, Tucson, AZ

3Des Moines University College of Osteopathic Medicine, Des Moines, IA

4Banner University Medical Center Tucson, Tucson, AZ

Geographic atrophy (GA) is a severe and poorly understood progression of dry age-related macular degeneration (AMD). Patients with GA are also more likely to develop choroidal neovascularization. Carbidopa-levodopa treatment has previously been reported to reduce neovascular AMD. In this study, we investigate the effects of carbidopa-levodopa treatment on the progression of GA.

A retrospective analysis of patients with pre-existing GA who participated in our proof-of-concept study was performed. Fundus autofluorescence (FAF) and optical coherence tomography (OCT) were used to confirm the presence of geographic atrophy. This study followed the 2018 retina consensus meeting requirements for measuring geographic atrophy markers. The primary outcome measures were complete retinal pigment epithelium and outer retinal atrophy (cRORA), hypertransmission through Bruch’s membrane, and the mm/year change from initiation of the study drug.
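
For illustration, the mm/year change for each marker can be estimated as a least-squares slope over serial measurements in the year before and the year after drug initiation; the sketch below shows that calculation with hypothetical dates and measurements.

```python
# Sketch: estimating the mm/year change in a GA marker before and after treatment
# initiation from serial measurements, using a least-squares slope.
# Dates and measurements below are hypothetical placeholders, not patient data.
import numpy as np

def annual_rate(days_from_start, measurements_mm):
    """Least-squares slope in mm/year from serial measurements."""
    years = np.asarray(days_from_start, dtype=float) / 365.25
    slope, _intercept = np.polyfit(years, np.asarray(measurements_mm, dtype=float), 1)
    return slope

pre = annual_rate([-365, -180, 0], [1.20, 1.21, 1.21])   # year before drug start
post = annual_rate([0, 180, 365], [1.21, 1.22, 1.22])    # year after drug start
print(f"pre: {pre:.4f} mm/year, post: {post:.4f} mm/year")
```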

We included 6 eyes of 5 patients with pre-existing geographic atrophy. This cohort consented to carbidopa-levodopa treatment and was predominantly male (3 patients), with a median (IQR) age of 82 (5). The mean GA change in cRORA was -0.000433 mm/year in the year before treatment initiation and 0.0061 mm/year in the year after. The mean GA change in hypertransmission was 0.0085 mm/year in the year before treatment initiation and 0.135 mm/year in the year after.

Geographic atrophy progressed in all eyes except one. According to a 2021 Ophthalmic Research meta-analysis, the average GA growth rate across 23 studies was 0.33 mm/year. Thus, our results suggest that carbidopa-levodopa treatment may help slow GA progression rates. Further studies are indicated into the pathogenesis of GA and the role that carbidopa-levodopa might play in its treatment.

1Western University of Health Sciences, Pomona, CA

2Western University of Health Sciences College of Osteopathic Medicine of the Pacific, Pomona, CA

3The University of Arizona College of Medicine Tucson, Tucson, AZ

This study assesses medical and physical therapy students’ knowledge level in concussion symptoms, diagnosis, and treatment. Parameters we evaluated included how education level, sports background, and concussion history influenced students’ concussion knowledge. The study assessed how these students learn about concussions and whether gaps in knowledge exist. The ultimate goal is to use the survey results to help educators better prepare medical and physical therapy students for patient care.

The first phase of our study involved sending a 14-question electronic survey to osteopathic medical schools across the United States, which assessed demographics, concussion knowledge level, source of concussion education, and interest in curriculum-based learning. The second phase consisted of sending a similar electronic survey expanded to allopathic and physical therapy schools. This survey consisted of 16 questions, including 2 additional demographic questions about gender and the type of degree being pursued.

Preliminary collection of over 800 responses and analysis of the data show that 60.2% of MD, DO, and physical therapy students played sports in high school, in college, or professionally. In addition, 42.7% of participants reported sustaining at least one concussion in their lives. 26.9% of participants reported learning about concussions through non-academic means, while 70% reported learning via academic means such as lectures, literature reviews, or clinical rotations. Our results showed that 80% of participants agreed they would like more formal education on concussions.

While data collection is still ongoing, the preliminary results of our study indicate that having a sports background or personal experience with concussion may influence students’ knowledge of concussion diagnosis and treatment. A large percentage of our participants learned about concussions through non-academic methods, which may indicate that a sports background and/or concussion history serves as an alternative route to concussion knowledge. Participants agree that more education is needed to solidify or supplement concussion knowledge and to best prepare rising health care professionals for clinical settings.

1Western University of Health Sciences, Pomona, CA

2Cedars-Sinai Medical Center, Los Angeles, CA

3Mayo Clinic Arizona, Scottsdale, AZ

Andexanet alfa was FDA approved in May 2018 to reverse the anticoagulant effects of Factor Xa inhibitors like Apixaban and Rivaroxaban, thereby creating a pro-thrombotic state. Thromboembolic complications within 30 days of Andexanet alfa administration have been reported. Here, we present for the first time a thrombotic cerebral event that appeared immediately after Andexanet alfa infusion in a patient with acute intraventricular hemorrhage (IVH).

A 73-year-old man presented to our emergency department with sudden onset of a severe headache. Head CT demonstrated 2.4 mL of IVH. CT angiogram showed 60% stenosis of the left supraclinoid internal carotid artery (ICA). The patient had been taking 5 mg Apixaban twice daily for atrial fibrillation, with his last dose 5.5 hours prior to presentation. IVH indicated the patient may benefit from anticoagulation reversal via Andexanet alfa.

A 400 mg bolus of Andexanet alfa was administered, followed 30 minutes later by a 2-hour infusion of an additional 480 mg, immediately after which the patient exhibited global aphasia, temporarily alleviated by head-of-the-bed flattening. A left ICA territory mismatch (342 mL) and a 76 mL core infarct were observed on CT perfusion. Shortly afterwards, the patient developed a persistent and severe left middle cerebral artery (MCA) stroke syndrome with an NIH stroke scale (NIHSS) score of 23. Emergent cerebral angiogram was then performed, revealing a new sizeable thrombus in the left cervical ICA. Successful thrombectomy yielded a TICI score of 2B. However, neurologic status remained poor due to development of a large left MCA territory infarct, and the patient’s family chose to withdraw supportive care.

Our observation of a thrombotic event occurring immediately after Andexanet alfa administration challenges current administration guidelines. The ANNEXA-4 phase 3 trial reported thrombotic events within 7 days of Andexanet alfa administration in 4% of subjects and within 30 days in 10%. The earliest event in the literature was noted 1 day after treatment. Andexanet alfa also appears to carry a thrombotic risk as much as 7% higher than other reversal agents such as four-factor prothrombin complex concentrate. Further elucidation of its effects on the coagulation cascade is warranted to improve safe clinical practice. Interestingly, no thrombotic events were reported in ANNEXA-4 patients who restarted anticoagulation protocols. Thus, it may be advisable to monitor patients closely and broadly for thrombotic events until future studies update protocols for resuming anticoagulation therapy after reversal treatment.

University of Utah Health, Salt Lake City, UT

Acute Respiratory Distress Syndrome (ARDS) is characterized by hypoxic respiratory failure, multi-organ dysfunction, and mortality. ARDS results from inflammatory alveolar injury precipitated by direct and indirect lung injury. Neutrophils play a central role in the pathology of ARDS and release neutrophil extracellular traps (NETs) to trap and kill pathogens. Dysregulated NET formation, however, can cause inflammatory tissue damage and exacerbate acute lung injury, as in COVID-19-associated ARDS. Whether NETs participate pathogenically in non-COVID-19-associated ARDS remains unknown. We hypothesized that plasma NET levels correlate directly with disease severity and mortality in non-COVID-19 ARDS patients.

We obtained previously collected plasma samples from patients (n=200) with moderate to severe ARDS enrolled in the Re-evaluation of Systemic Early Neuromuscular Blockade (ROSE) trial at three different time points (admission, 24 hours, and 48 hours after admission) complete with clinical outcome data through 28 days after admission. We also examined age- and gender-matched healthy donor plasma (n=20). We assayed cell-free DNA levels via fluorescence as a surrogate for NETs in each plasma sample. Clinical outcomes from ROSE trial participants were correlated with the quantification of NETs. We also assessed NET formation by neutrophils isolated from healthy adults following incubation with ARDS patient and healthy donor plasma samples using live cell imaging and confocal microscopy.
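
As an illustration of the outcome comparison, the sketch below compares admission cell-free DNA between non-survivors and survivors; the abstract does not name the statistical test, so a Mann-Whitney U test is shown as one reasonable choice, and all values are hypothetical.

```python
# Sketch: comparing admission plasma cell-free DNA between ARDS non-survivors and
# survivors. The test choice (Mann-Whitney U) is illustrative, not necessarily the
# one used in the study; values are hypothetical placeholders.
from scipy import stats

cfdna_nonsurvivors = [820, 910, 760, 1005, 880, 930]  # arbitrary fluorescence-derived units
cfdna_survivors = [540, 610, 580, 495, 630, 560]

u_stat, p_value = stats.mannwhitneyu(
    cfdna_nonsurvivors, cfdna_survivors, alternative="two-sided"
)
print(f"U = {u_stat}, p = {p_value:.4f}")
```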

We demonstrated elevated cell-free DNA in ARDS plasma compared to healthy donor plasma. Deceased study participants demonstrated higher plasma cell-free DNA levels on admission and at 48 hours as compared to ARDS survivors (admission: p = 0.0045 and 48 hours: p = 0.0050). Increased cell-free DNA on admission, at 24 hours, and 48 hours also correlated with illness severity. Furthermore, ARDS plasma samples induced NET formation in vitro in neutrophils isolated from healthy donors while control plasma did not.

NET formation is increased in plasma from patients with ARDS compared to healthy donor plasma, consistent with the inflammatory alveolar injury seen in ARDS. Additionally, plasma from ARDS patients induces NET formation in vitro in PMNs isolated from healthy adult donors. We speculate that exaggerated NET formation may serve as a novel biomarker for inflammatory lung injury in ARDS resulting from multiple etiologies and strategies targeting NET formation may improve outcomes in ARDS.

1University of Colorado Denver School of Medicine, Aurora, CO

2University of Colorado Denver Department of Medicine, Aurora, CO

Pulmonary Hypertension (PH) is a life-threatening disorder characterized by increased pulmonary vascular resistance, right ventricular systolic pressures (RVSPs) and right ventricular hypertrophy (RVH), driven in part by inflammation. Our previous studies have demonstrated that platelets are activated in mice with hypoxia exposure, leading to the release of the proinflammatory chemokines Platelet Factor 4 (PF4) and CCL5, contributing to hypoxia induced lung inflammation. Nbeal2 KO mice are platelet alpha-granule deficient. Alpha granules contain numerous chemokines, including PF4 and CCL5. We hypothesized that Nbeal2 KO mice would be protected from hypoxia-induced PH.

Male and female C57BL/6 and Nbeal2 KO mice were exposed at 8–9 weeks of age to 10% hypobaric hypoxia or remained in normoxia for 21 days. Whole blood was collected via RV cardiac puncture using heparin-coated syringes and analyzed immediately. RVSPs were obtained by closed-chest RV puncture. Hearts were dissected to obtain the weights of the RV and of the septum plus left ventricle (LV+S). Fulton’s index (RV/(LV+S)) was used to determine RV hypertrophy as an indicator of the development of PH.
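
For clarity, Fulton’s index is simply the right ventricular weight divided by the combined weight of the left ventricle and septum; the sketch below illustrates the calculation with hypothetical weights.

```python
# Sketch: Fulton's index (RV / (LV + septum)) as a marker of right ventricular
# hypertrophy. The weights below are hypothetical placeholders, not study data.
def fultons_index(rv_mg, lv_plus_septum_mg):
    return rv_mg / lv_plus_septum_mg

print(round(fultons_index(rv_mg=22.0, lv_plus_septum_mg=75.0), 2))  # normoxia-like ratio
print(round(fultons_index(rv_mg=31.0, lv_plus_septum_mg=79.0), 2))  # hypoxia-like ratio
```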

Nbeal2 KO mice have lower platelet counts (434 ± 43.5 vs 780 ± 39.6 ×10³/µL, p<0.001) and larger platelets, demonstrated by increased mean platelet volume (MPV) (4.7 ± 0.05 vs 5.3 ± 0.04 fL, p<0.0001), compared to WT controls. RVSP under control conditions was similar in Nbeal2 KO and WT mice (28.14±0.91 vs 31.04±1.01 mmHg). There was a significantly greater hypoxia-induced increase in RVSP in Nbeal2 KO compared to WT mice (34.5 ± 1.05 vs 37.8 ± 0.66 mmHg, p<0.05). Although Nbeal2 KO RVSP was statistically higher than WT RVSP, both genotypes showed a 22% increase in RVSP with hypoxia exposure (WT CO vs. HPX: 28.1 ± 0.91 vs 34.5 ± 1.1 mmHg, p<0.001; KO CO vs. HPX: 31.0 ± 1.01 vs 37.8 ± 0.66 mmHg, p<0.0001). Fulton’s index under control conditions was similar between Nbeal2 KO and WT mice (0.29±0.014 vs 0.29±0.001). As expected, WT mice developed RVH when exposed to prolonged hypoxia (0.29 ± 0.01 vs 0.39 ± 0.02, p<0.001). Nbeal2 KO mice did not develop hypoxia-induced RVH (0.29±0.001 vs 0.32±0.02).

Mice deficient in alpha granules (Nbeal2 KO) have similar hypoxia-induced pulmonary vasoconstriction, but are protected against the development of RVH. Our future studies will address whether Nbeal2 KO mice demonstrate impaired platelet activation and/or decreased recruitment to the pulmonary circulation conferring protection from inflammatory mediated pulmonary vascular remodeling and PH.

University of California Davis, Sacramento, CA

Cardiac output (CO) monitoring is an important tool for hemodynamic optimization. Bolus thermodilution (iCO) with a pulmonary artery catheter (PAC) remains the gold standard for CO measurement, but is invasive and has been associated with complications. This study evaluates the level of agreement of CO values measured from multiple minimally-invasive CO monitor systems before and after cardiopulmonary bypass (CPB). CCO uses a modified thermodilution technology. Cheetah is based on thoracic bioreactance. ClearSight reconstructs the brachial arterial pressure waveform from the finger arterial pressure. CNAP CO is based on continuous non-invasive arterial pressure from the finger. LiDCO is based on the radial arterial blood pressure waveform. FloTrac calculates CO from the radial arterial pulse contour.

The IRB reviewed and approved this quality improvement study. Sixty patients were enrolled; 8 were excluded due to missing iCO measurements. CO measurements from the remaining 52 patients were evaluated using Bland-Altman analysis. CO values were measured simultaneously by bolus thermodilution with a PAC and by the CO monitors listed above.
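
For readers unfamiliar with the method, the sketch below computes the Bland-Altman bias, limits of agreement, and the percentage error used against the roughly 30% acceptability threshold for one monitor against iCO; the paired CO values are hypothetical.

```python
# Sketch: Bland-Altman comparison of a minimally-invasive CO monitor against bolus
# thermodilution (iCO), including the percentage error used for the ~30% criterion.
# Paired CO values below are hypothetical placeholders, not study data.
import numpy as np

ico = np.array([4.2, 5.1, 3.8, 6.0, 4.9, 5.5])       # L/min, reference method
monitor = np.array([4.6, 4.8, 4.1, 6.4, 4.5, 5.9])   # L/min, test device

diff = monitor - ico
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)            # 95% limits of agreement
pct_error = 100 * 1.96 * sd / np.mean((ico + monitor) / 2)

print(f"bias = {bias:.2f} L/min, LoA = ({loa[0]:.2f}, {loa[1]:.2f}) L/min")
print(f"percentage error = {pct_error:.1f}%")
```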

Not all values were available at all time points. The Bland-Altman plots are presented in figure 1, and the corresponding values are summarized in table 1.

Figure 1: Bland-Altman analysis of minimally-invasive CO monitors.

Table 1: Bland-Altman analysis of minimally-invasive CO monitors pre-bypass and post-bypass.

Based upon percentage errors, the relative accuracy of the minimally-invasive CO monitors compared to iCO was: CCO>Cheetah>ClearSight>FloTrac>CNAP>LiDCO. Measurements after CPB had slightly smaller percentage errors. A percentage error <30% is considered acceptable (Critchley et al., 1999). On this basis, the minimally-invasive CO monitors cannot replace the PAC for accurate CO measurement in cardiopulmonary bypass surgery.

Dwight David Eisenhower Army Medical Center, Fort Gordon, GA

Dabbing is an emerging form of cannabis consumption. Similar to vape-associated lung injury, it can result in acute respiratory distress syndrome (ARDS), and diagnosis is often obscured by a broad infectious differential.

The patient is a 36-year-old male with recurrent admissions for various infections in the setting of a splenectomy who was admitted for an undifferentiated inflammatory syndrome. He presented with fever, leukocytosis with bandemia, several episodes of emesis, and fatigue. He denied any cough, dyspnea, sputum production, or chest pain. Aside from fever, his vital signs were otherwise normal, and he was admitted to the general medical ward. Given his immunocompromised state and the lack of a clear infectious source, he was started on vancomycin, azithromycin, and meropenem. Despite this regimen, his fever persisted, leukocytosis worsened, and hypoxia developed; the latter rapidly progressed, requiring intubation for hypoxemic respiratory failure. CT scan of the chest revealed bilateral consolidation and ground-glass opacities. He met Berlin criteria for ARDS and was started on glucocorticoids. Cytology from bronchoscopy revealed abundant alveolar macrophages with oil red O staining and mixed inflammatory cells. A comprehensive infectious workup, including COVID-19 PCR x2, 3 sets of blood cultures, urine culture, and bronchoalveolar lavage for bacteria, acid-fast organisms, and fungus, was negative. This negative infectious workup, coupled with the cytology findings, indicated lipoid pneumonia as the etiology of his illness. His clinical condition improved after several days, and he was successfully extubated and eventually discharged on room air. It was discovered that the patient had been vaping with THC products including dabs. Dabs, a wax-like THC product, are likely the mechanism by which lipids were introduced into his lungs, resulting in pneumonia.

This case represents a presentation of lipoid pneumonia secondary to ‘dabbing,’ a relatively novel form of ingesting cannabis. There have been few reported cases of respiratory failure secondary to ‘dabbing,’ and this case identifies lipoid pneumonia as the cause of lung injury. This case highlights the need for physicians to be aware of specific forms of recreational drugs and routes of delivery.

University of California Davis, Davis, CA

Signaling in lung epithelial cells plays a role in respiratory disease pathogenesis. ERK, NFκB, and AMPK are key kinases regulating cell growth and proliferation that are implicated in airway inflammation and disease. Importantly, ERK and AMPK display heterogeneous and temporally dynamic signaling activity that can be linked to cell behavior but has yet to be investigated in the context of airway disease. We hypothesize that unique signatures of short-term oscillatory signaling activity (minutes) differentially regulate long-term (>24 hour) inflammatory responses in part via regulation of the transcription factor STAT3 at the cellular level.

Using fluorescent biosensors and live-cell imaging, we track single-cell kinase signaling activity in our Human Bronchial Epithelial (HBE1) cell line and in primary human bronchial epithelial cells (pHBE) continuously at 6-minute intervals, in both submerged and Air-Liquid Interface (ALI) culture conditions. Computational image analysis extracts kinase signaling activity profiles in response to growth factors and inflammatory cytokines. After 24 hours of ligand exposure, cells are fixed and immunofluorescent staining for nuclear pSTAT3 is performed to measure the cellular inflammatory response.
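
As a simplified illustration of what such per-cell activity profiles can look like downstream of image analysis, the sketch below summarizes a single synthetic kinase-activity trace sampled every 6 minutes by counting oscillation peaks; the trace and parameters are hypothetical, not derived from the imaging pipeline described.

```python
# Sketch: summarizing a single-cell kinase activity trace sampled every 6 minutes
# by counting oscillation peaks. The trace below is synthetic, not study data.
import numpy as np
from scipy.signal import find_peaks

sampling_interval_min = 6
t = np.arange(0, 24 * 60, sampling_interval_min)      # 24 hours of time points
rng = np.random.default_rng(1)
trace = 0.5 + 0.2 * np.sin(2 * np.pi * t / 90) + 0.02 * rng.normal(size=t.size)

peaks, _ = find_peaks(trace, prominence=0.1)          # detect activity peaks
print(f"{len(peaks)} peaks over 24 h (~{len(peaks) / 24:.2f} peaks/hour)")
```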